"Consoles as We Know Them are Gone", the bright future of PC Gaming

Tags:
  • PC gaming
  • Video Games
March 22, 2008 5:37:25 AM

http://www.extremetech.com/article2/0,1697,2277507,00.asp

Here's a great article about PC gaming and its bright future!!

I'd love to hear what everyone thinks.

I'm too tired to write my own thoughts on this right now but I will in the next couple days.

March 22, 2008 6:20:48 AM

Nice read, thanks for the link.

It certainly looks like this guy knows what he's talking about. My only gripe is that at the end of part 2 the interview suddenly cuts short, just as they are discussing the merging of the CPU and GPU (again).

He talks about Intel and AMD getting it right, and at that particular point I was wondering why he didn't spend some time on the fact that AMD acquired ATI, and on what is going to happen to NVidia. He probably 'knows' and has some personal stake in it (shares in the right companies).
March 22, 2008 11:27:38 AM

Kind of rambling and repetitive. He makes some good points, but it really seems like he didn't prepare for the interview ahead of time and just kind of stumbled around before he got to them.
March 22, 2008 5:02:54 PM

Quote:
I was wondering why he didn't spend some time on the fact that AMD acquired ATI, and on what is going to happen to NVidia?


I was wondering the same thing; how can you talk about CPUs and GPUs coming together without talking about AMD/ATI and NVidia?

Another very interesting point made in this article was about the future of retail. It's hard to argue with the fact that gaming will start becoming more virtual. I know from my past experience with EB Games that upper management over there is truly concerned about downloadable content. Downloadable content aside, I have a hard time believing that the big retailers would stop selling hardware when they make so much money off the accessories. The markup can easily reach 50% on controllers and over 80% on cables; on Monster (crap) cables it's around 90%. Here's a number out of EB: Monster Cable accessories were only about 1% of EB Games' sales, but accounted for around 10% of their overall profit! I could see this kind of future hurting a company like GameStop, but the big guys will still sell gaming hardware even if all games were sold online.
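
To make that concrete, here's a quick back-of-the-envelope check in Python (the revenue figure and the blended margin on everything else are invented round numbers for illustration, not EB's actual books):

# Sketch: how a ~90% markup line at 1% of sales can be ~10% of profit.
revenue = 1_000_000.0            # hypothetical annual store revenue
cable_sales = 0.01 * revenue     # Monster cables: 1% of sales

# A 90% markup means price = 1.9 * cost, so gross margin = 0.9 / 1.9 (~47%).
cable_margin = 0.9 / 1.9
other_margin = 0.043             # assumed thin blended margin on games/hardware

cable_profit = cable_sales * cable_margin
other_profit = (revenue - cable_sales) * other_margin

print(f"cable share of sales:  {cable_sales / revenue:.0%}")                          # -> 1%
print(f"cable share of profit: {cable_profit / (cable_profit + other_profit):.0%}")   # -> 10%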

Anyway, I would love to hear more about the new CPU technology discussed in this article.

I'm not sold on the "console on the PC" idea; however, anything that can make PC games less expensive by cutting down on piracy, thereby increasing the market, sounds great to me!

March 22, 2008 6:41:59 PM

While it may be true that the annual revenue for the platform is higher, this is thanks only to World of Warcraft. This article is not an accurate representation of the market as a whole. In fact, if World of Warcraft were excluded, it's likely the PC as a gaming platform would fall behind the Wii, Xbox360, PS3, PS2, PSP and DS. He's simply "preaching to the choir" with this interview.
March 22, 2008 6:49:12 PM

You have to keep in mind, there is a limited amount of money that will go toward games. If enough money is tied up in WoW to account for the vast sales difference between PC and consoles, then that easily explains the slump. There is just less money and a smaller audience available for those titles since a heavy percentage of PC gamers are busily playing WoW and paying for WoW instead of everything else.
March 23, 2008 3:24:34 AM

Thanks for the link, that was a great interview. I especially liked the "Vista blows" interview that followed. I learned a few things about operating systems and how 3D games are made and intended to run.
March 23, 2008 3:58:14 AM

FaceLifter said:
The markup can easily reach 50% on controllers and over 80% on cables; on Monster (crap) cables it's around 90%.

Speaking of Monster cables... reminds me that using gold for electrical connectors is no good, the resistance is too high. What a con... :)
March 23, 2008 9:22:33 AM

After reading this interview I'm stunned. I've been saying the same thing here for a long time, almost word for word. I'm glad somebody sees this the way I do. Intel has hurt the PC gaming industry, and what's good to say about Vista? Especially concerning gaming. What I'm worried about is the influence Intel has: since it's heading into the graphics market, is it going to screw things around, putting graphics second to the almighty CPU? Or have it done their way, maybe excluding, say, ray tracing and going with something else, something that already benefits their plans and their hardware? Someone told me they're (M$ and Intel) spending 20 million on multithreaded apps. Well, whoopie, what's that, half a game? At least someone that's been there and done that, a mover in the industry, sees all this for what it is, and maybe will help shape the future of PC gaming.
March 31, 2008 6:25:30 PM

FaceLifter said:
I was wondering the same thing; how can you talk about CPUs and GPUs coming together without talking about AMD/ATI and NVidia?


It really doesn't matter, because the whole thing is silly. You'll always be better off with two chips than one, except at the bottom end of the market where current integrated graphics sit and cost is the primary factor.

There is simply no good performance reason to put a GPU and CPU in the same chip, and I don't know what he's been smoking if he thinks that it will magically make games 5-10x faster. You have fewer transistors available and far less memory bandwidth; that's never going to compete against a two-chip CPU/GPU combination unless the latter is crippled by a very low-speed bus between the two. The only benefit is that if you put a simple GPU into a low-end CPU you can potentially save a few bucks by removing one external chip from the motherboard.
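
To put rough numbers on the bandwidth gap, here's a Python sketch (the parts and figures are period-typical approximations, not measurements):

# Peak memory bandwidth = bus width in bytes * transfer rate.
def bandwidth_gb_s(bus_bits, transfers_per_s):
    return (bus_bits / 8) * transfers_per_s / 1e9

# Shared system memory: dual-channel DDR2-800 (2 x 64-bit at 800 MT/s),
# which an integrated GPU would have to split with the CPU.
shared = bandwidth_gb_s(128, 800e6)

# Dedicated card: e.g. an 8800 GT's 256-bit GDDR3 at ~1800 MT/s effective,
# all of it reserved for the GPU.
dedicated = bandwidth_gb_s(256, 1800e6)

print(f"shared DDR2    : {shared:4.1f} GB/s (CPU and GPU together)")   # 12.8
print(f"dedicated GDDR3: {dedicated:4.1f} GB/s (GPU alone)")           # 57.6
print(f"ratio          : {dedicated / shared:.1f}x")                   # 4.5x

And that 4.5x gap is before the CPU takes its own share of the shared bus.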

I do tend to agree with him about consoles though; right now, selling high-end consoles seems to be just a license to lose money. Most of the high-end hardware is driven by the PC market, and sticking PC components in a box and selling below cost price doesn't seem like a good business plan.
March 31, 2008 8:11:50 PM

I'm thinking in much larger terms when it comes to a CPU/GPU package. Like 8/16 in a package, together, mostly eliminating bus issues, and at a lower node, it'll work
March 31, 2008 9:44:00 PM

JAYDEEJOHN said:
I'm thinking in much larger terms when it comes to a CPU/GPU package. Like 8/16 in a package, together, mostly eliminating bus issues, and at a lower node, it'll work


And it will still be slower than a 16-core CPU talking to a 32-core GPU on a separate card, each using the same number of transistors as your single combined chip. Nothing will change the fact that having twice as many transistors is going to give significantly faster processing, particularly when coupled with two or more times the memory bandwidth; that's precisely why we stopped using CPU-powered VGA graphics all those years ago.
March 31, 2008 10:09:40 PM

I'm not talking CPUs, but parallel GPUs, where many is more, combined properly WITH a CPU (or CPUs)
March 31, 2008 10:13:03 PM

Then of course there's ray tracing...
March 31, 2008 11:05:24 PM

JAYDEEJOHN said:
I'm not talking CPUs, but parallel GPUs, where many is more, combined properly WITH a CPU (or CPUs)


And again, it will still perform worse than two chips of the same size on different boards, each with their own memory. The bus between the CPU and GPU is nowhere near the limiting factor, whereas memory bandwidth and/or raw pixel processing power usually is, so putting the GPU in the CPU with half the transistors and a fraction of the memory bandwidth makes absolutely no sense for a performance system. In addition, of course, it makes upgrading your graphics impossible without also upgrading your CPU, which is great for CPU manufacturers, but not so good for the rest of us.

I'm really not sure why this is so hard to understand.
April 1, 2008 7:48:40 AM

MarkG said:
And again, it will still perform worse than two chips of the same size on different boards, each with their own memory. The bus between the CPU and GPU is nowhere near the limiting factor, whereas memory bandwidth and/or raw pixel processing power usually is, so putting the GPU in the CPU with half the transistors and a fraction of the memory bandwidth makes absolutely no sense for a performance system. In addition, of course, it makes upgrading your graphics impossible without also upgrading your CPU, which is great for CPU manufacturers, but not so good for the rest of us.

I'm really not sure why this is so hard to understand.


Mark.. why do you think jaydeejohn brought up raytracing/volumetric rendering? If you want to do that efficiently then you will need quick access to large chunks of shared memory or a very quick way to transfer data from CPU mem to GPU. Those kinds of rendering techniques require the whole volume representation that is going to be rendered to be in memory.

Btw, current GPU architecture is not suited for volumetric rendering, but the current architecture is nearing the limit of what can be done in realistic imaging today. If you want to have the kind of quality in (realtime) graphics that you see in movies today (not realtime), then we will need to move on. I'm sure we will see this happening in the coming years, and I am sure that it will involve getting the new VR GPU a lot closer to the CPU (sharing memory).
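
To give a feel for the amounts of data involved, a small Python sketch (the grid size, bytes per voxel, and the ~4 GB/s PCIe figure are ballpark assumptions):

# Why volume rendering wants shared memory: the footprint of a modest
# voxel grid, and the time to re-upload it over a 2008-era PCIe x16 link.
voxels = 512 ** 3                # a modest 512^3 volume
bytes_per_voxel = 4              # e.g. one 32-bit density sample
volume_bytes = voxels * bytes_per_voxel
pcie_gb_s = 4.0                  # PCIe 1.x x16, roughly 4 GB/s per direction

print(f"volume size : {volume_bytes / 2**30:.1f} GiB")                    # 0.5 GiB
print(f"upload time : {volume_bytes / (pcie_gb_s * 1e9) * 1000:.0f} ms")  # ~134 ms

Re-uploading a changing volume every frame would blow a 16 ms frame budget several times over; shared memory (or a far faster link) removes that copy entirely.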

And I do understand that this is (perhaps) somewhat hard to understand, depending on how much people know about graphics rendering.
April 1, 2008 12:01:44 PM

My main concern with combining CPU and GPU into a single chip is that this new chip will be many times harder to manufacture and prices will go up. Putting two separate chips in the same package would alleviate that to a large degree.

I don't see, though, why we could not reinvent our bus system instead of reinventing our processing system. Just give the GPU direct access to the CPU memory. On the other hand, why are we assuming that we cannot add a second bus for the GPU side of the chip? Different ways of achieving the same goal.
April 1, 2008 2:33:04 PM

BigMac said:
Mark.. why do you think jaydeejohn brought up raytracing/volumetric rendering?


Because it's the latest buzzword that's not going to go anywhere?

Quote:
If you want to do that efficiently then you will need quick access to large chunks of shared memory or a very quick way to transfer data from CPU mem to GPU.


Even if you believe that, you've just shown that putting both CPU and GPU on a shared memory bus is a bad idea, because it dramatically reduces memory bandwidth (unless you think that Joe Sixpack is going to buy memory four or more DIMMs at a time so he can have a 256-bit memory bus or more). There is no reason why ray-tracing or volumetric rendering needs substantially more data transfer from CPU to GPU, other than the fact that volumetric rendering uses absolutely massive amounts of RAM.
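
The DIMM arithmetic behind that aside is simple enough to sketch in Python (assuming standard 64-bit desktop DIMM channels):

# Each desktop DIMM channel is 64 bits wide, so matching a GPU-class
# memory bus width means populating several channels at once.
dimm_channel_bits = 64
for gpu_bus_bits in (128, 256, 512):
    channels = gpu_bus_bits // dimm_channel_bits
    print(f"{gpu_bus_bits}-bit bus -> at least {channels} DIMMs populated")
# A 256-bit bus needs four channels; mainstream desktop boards of the
# day offered two.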

Quote:
Btw, current GPU architecture is not suited for volumetric rendering


From what I remember, 3Dlabs were doing volumetric rendering on a PC GPU at least five years ago. There's absolutely no reason why it can't be done, it's just not an efficient way of rendering in anything other than specialised markets like medical imaging (where what comes out of your scanner is a volumetric representation of the body).

Quote:
but the current architecture is nearing the limit of what can be done in realistic imaging today.


No, it's roughly a bazillion miles from what can be done with current architectures. Game developers could use 1000x as much power as current GPUs give them.

Quote:
And I do understand that this is (perhaps) somewhat hard to understand, depending on how much people know about graphics rendering.


LOL :) .
April 1, 2008 3:45:14 PM

MarkG said:

Quote:
If you want to do that efficiently then you will need quick access to large chunks of shared memory or a very quick way to transfer data from CPU mem to GPU.


Even if you believe that, you've just shown that putting both CPU and GPU on a shared memory bus is a bad idea, because it dramatically reduces memory bandwidth (unless you think that Joe Sixpack is going to buy memory four or more DIMMs at a time so he can have a 256-bit memory bus or more). There is no reason why ray-tracing or volumetric rendering needs substantially more data transfer from CPU to GPU, other than the fact that volumetric rendering uses absolutely massive amounts of RAM.


I only write things I believe, so do not worry about that (what would be the point otherwise?). By putting them on the same chip, other options become available that will not reduce bandwidth. Joe is going to pay for superduper graphics one way or the other; he doesn't care about the technology, but he is willing to part with significant money for major improvements.

And that last point is exactly the crux. It seems we agree on that: volume rendering uses massive amounts of RAM.
MarkG said:

Quote:
Btw, current GPU architecture is not suited for volumetric rendering


From what I remember, 3Dlabs were doing volumetric rendering on a PC GPU at least five years ago. There's absolutely no reason why it can't be done, it's just not an efficient way of rendering in anything other than specialised markets like medical imaging (where what comes out of your scanner is a volumetric representation of the body).

It can be done, it has been done, and more accurately we are doing it (guess what field I work in :) ), but it is not efficient with current technology at all. The fact that it only hits specialised markets currently has to do with the fact that visualizing actual 3D data is restricted to those specialised markets at the moment. That is going to change.

MarkG said:

Quote:
but the current architecture is nearing the limit of what can be done in realistic imaging today.


No, it's roughly a bazillion miles from what can be done with current architectures. Game developers could use 1000x as much power as current GPUs give them.


Here is where we differ in opinion, and I'm very open as to what you would suggest is possible with current architectures. For me it boils down to two things: 1) volume rendering is the superior way of rendering 2D images from 3D scenes, as it approaches/models the way light is presented to the human eye. You can do much more convincing rendering of rain, for instance (examples of weather and other physical phenomena aplenty). The fact that ray tracing is used in all instances of rendering where real time is not a requirement says it all.

If you are with me so far, then:
2) Why settle for 100x or 1000x more power and keep up the 2D box of tricks for creating 3D effects, when at only 5x or 10x more power (or even less), on a volume-rendering-optimized architecture, we would have realtime ray tracing in place? (For what the per-ray work actually looks like, see the sketch at the end of this post.)

MarkG said:

Quote:
And I do understand that this is (perhaps) somewhat hard to understand, depending on how much people know about graphics rendering.


LOL :) .


Well, glad you like it. I don't, really, but you started it by implying that people are not understanding you when they might just disagree with you. I thought I'd return the favor. I do think you know what you are talking about, so you do not have to worry about that, and if you think I do not know what I am talking about, that's your problem, not mine :)
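
Since ray tracing keeps coming up, here is the core primitive in a few lines of Python: a single ray/sphere intersection test. A toy sketch, nowhere near a real renderer, but it shows why the technique is conceptually clean and computationally heavy at the same time (one such test per object, per ray, per pixel, per frame):

import math

def ray_sphere(origin, direction, center, radius):
    """Distance to the nearest hit along a normalized ray, or None."""
    # Solve |o + t*d - c|^2 = r^2, a quadratic in t (a = 1 for unit d).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0     # nearest of the two roots
    return t if t > 0.0 else None

# One primary ray per pixel; shadow and reflection rays multiply that,
# which is where the enormous compute demand comes from.
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # -> 4.0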

April 1, 2008 5:59:13 PM

Quote:

Because it's the latest buzzword that's not going to go anywhere?


I hardly think these "buzzwords" are going nowhere, as you put it. But I'm not as knowledgeable as you guys are... at least not yet [I'm in school for game design ;) ]. I'm sure that over the next couple of months we will be hearing a lot more on this topic.

I have to say that I'm enjoying this conversation immensely...thanks guys.
April 4, 2008 10:04:33 PM

infornography42 said:
You have to keep in mind, there is a limited amount of money that will go toward games. If enough money is tied up in WoW to account for the vast sales difference between PC and consoles, then that easily explains the slump. There is just less money and a smaller audience available for those titles since a heavy percentage of PC gamers are busily playing WoW and paying for WoW instead of everything else.


Yup. And to tell you the truth, WoW isn't all that great; people just get addicted easily. I think once the CPU/GPU merge happens it will change, since PCs will be cheaper and will probably have decent onboard GPUs that can play most games. Could you imagine paying $600 for an HTPC that will play Crysis at a decent resolution in 2009? Would be nice, but we will have to wait and see, right?

FaceLifter said:
MS Vista Service Pack 2 to support DirectX 11 and Intel's ray tracing!!

Here's another article to back up the first one.

http://www.techarp.com/showarticle.aspx?artno=526&pgno=...

Could be rumor, but I doubt it. Everyone knew Intel was working on its ray-tracing technology.


Yeah, Intel does have ray tracing, which is what Larrabee is supposed to run on. That's nice. I was going to get an ATI R770 when it was decently priced, but that would only be DX 10.1 with no ray tracing. I guess I might not.

JAYDEEJOHN said:
I'm thinking in much larger terms when it comes to a CPU/GPU package. Like 8/16 in a package, together, mostly eliminating bus issues, and at a lower node, it'll work


I was thinking the same thing. Since the GPU will be on the die or in the package, won't that give it a much better interface and more bandwidth to use, thus cutting a lot of the time between the CPU and GPU? (Rough numbers in the sketch at the end of this post.)

I think it will, and it will allow those CPU/GPU combos to give better graphics than everyone thinks, whether it's Intel or AMD.

Well, I just hope this is true. I want to see PC gaming take back the crown it deserves. PCs are the reason games are where they are now and deserve the recognition for it.
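
On the latency point, here's a rough order-of-magnitude sketch in Python (the figures are illustrative assumptions, not vendor specs for any particular part):

# Rough round-trip latency classes for a CPU asking the GPU for data.
ns = 1e-9
links = {
    "PCIe x16 round trip":     1000 * ns,  # microsecond class
    "on-package interconnect":   50 * ns,  # tens-of-nanoseconds class
    "on-die, shared cache":      10 * ns,  # cache-hit class
}
for name, latency in links.items():
    print(f"{name:24s} ~{latency / ns:5.0f} ns")
# Bandwidth still favors the dedicated card, but for chatty CPU<->GPU
# work the on-die path wins on latency by about two orders of magnitude.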
April 5, 2008 3:01:59 PM

I agree! :lol: 

I would enjoy the PC more if my computer were better, but I realize that, with my new 8800 GT, my computer is now more powerful than Microsoft's Xbox 360. That's not to say I don't like it, but geez.
April 5, 2008 5:37:29 PM

I guess that I'm a little less encouraged by these developments for the future of what we think of as PC gaming. I am 100% convinced that the next generation of consoles is just going to be proprietary, game-oriented PCs from M$ and Sony. So I can't see why any tech developments would not just get co-opted into those consoles. I mean, I can totally see M$ using some muscle to snatch up some of these techs as M$ X-PC "exclusives".

I suppose that I'm just a little more pessimistic about the rational capabilities of the lowest common denominator. I mean, when faced with a $600-700 performance PC and a similarly priced box with similar features carrying the brand "X-Box" or "PS4" or whatever, the average trog is gonna go with his Pavlovian instinct and plunk down the dough for his corporate overlords....
April 7, 2008 6:12:53 AM

Yeah, I see your point, Chazwuzzer, and to a certain extent I agree. I think what is being pointed out here is that the hardcore gamers are the ones who will be coming back to the PC. These are people who are willing to learn new technologies to have the best gaming experiences they can. The average/casual gamer, as well as the hardcore gamer, will still be using consoles, but the driving force in high-end gaming (graphics) will be coming from the PC. Let's face it, graphics are a major driving force in the video game industry. There will always be a successful console... for anyone to doubt that or even debate it is just silly.
April 7, 2008 8:39:49 AM

FaceLifter said:
MS Vista Service Pack 2 to support DirectX 11 and Intel's ray tracing!!

Here's another article to back up the first one.

http://www.techarp.com/showarticle.aspx?artno=526&pgno=...

Could be rumor, but I doubt it. Everyone knew Intel was working on its ray-tracing technology.


Well...

If you have read my previous responses here, you know that I think ray tracing is coming to gaming sooner or later, but I highly doubt it will be this year. Did you actually bother to check the posting date on this particular piece? And did you read the "Industry comments" on the last page? It's pretty obvious that this was a Malaysian April 1st prank :)
April 7, 2008 7:05:25 PM

LOL, that's too funny... (shrugs). At least it made for some good conversation. My guess is that it'll be coming with Windows 7, which we should be hearing more about soon.
April 12, 2008 6:26:39 AM

Well, considering that Intel already has ray-tracing software of some sort, I would think it may start to hit by 2009. id Software has an engine that runs on ray tracing.

Heck, I think Larrabee will have ray-tracing support. But that's yet to be seen, as we don't have any info on it other than that it will be a multi-GPU-on-die card.

I don't know about DX11, though. It could start on Vista the way XP started with DX8 and moved to DX9. Or they could skip DX11 support on Vista and go straight to Windows 7, like they did with DX10 and XP. Who knows; we will have to wait and find out, right?
April 13, 2008 3:34:45 AM

Here's to the future ----> *clink*!