Let's do away with video cards altogether!

david__t

Distinguished
Feb 2, 2003
200
0
18,680
If we are going to need more and more power, then instead of filling up all of our motherboard slots with video boards (check out Tom's latest Quad SLI article for a very graphic example) and leaving no space for any other add-in cards, why don't we have a GPU socket on our motherboards instead?

Just think about it - now that AMD has ATI, it is the perfect opportunity. You could have a system similar to the way the Xbox works, with a large chunk of unified RAM for the system to call on. It would have to be very fast memory of course, but I think this would be a great solution.

Also, this would be a much better way to offload physics calculations to whichever processor has spare capacity (the CPU and GPU could have a super-fast HyperTransport-type link). And think of the peace and quiet if you could use full-size CPU heatsinks on a GPU instead of a loud, thin solution.

Think of it as souped-up onboard graphics - heck, it would be just like the multi-CPU motherboards which have been around for years.
 

MarkG

Distinguished
Oct 13, 2004
841
0
19,010
why don't we have a GPU socket on our motherboards instead?

We do. It's called 'AGP' or 'PCI-Express x16'.

It would have to be very fast memory of course, but I think this would be a great solution.

No, it would be retarded: there's a reason why GPUs have 256-bit 1.5GHz memory buses and CPUs don't.
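
To put a rough number on that gap, here's a quick back-of-the-envelope sketch. The GPU figure uses the 256-bit / 1.5GHz bus quoted above; the CPU side assumes a typical 2006 desktop with dual-channel DDR2-800 (128-bit combined bus, 800MT/s effective) - that comparison point is my own assumption, not something from the thread:

```python
# Back-of-the-envelope peak memory bandwidth comparison.
# Bandwidth = (bus width in bytes) * (effective transfer rate).

def bandwidth_gb_s(bus_width_bits, effective_ghz):
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second (in billions)."""
    return (bus_width_bits / 8) * effective_ghz

# High-end 2006 video card: 256-bit GDDR3 at 1.5 GHz effective
gpu = bandwidth_gb_s(256, 1.5)   # 48.0 GB/s

# Typical desktop CPU: dual-channel DDR2-800 = 128-bit at 0.8 GHz effective
cpu = bandwidth_gb_s(128, 0.8)   # 12.8 GB/s

print(f"GPU: {gpu:.1f} GB/s, CPU: {cpu:.1f} GB/s, ratio: {gpu / cpu:.2f}x")
```

So the card has nearly four times the memory bandwidth of the whole system bus - feeding a socketed GPU through shared system RAM would starve it.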
 

Gary_Busey

Distinguished
Mar 21, 2006
1,380
0
19,280
The GPU isn't the only thing that changes from one generation of video card to another. You would be limiting the GPU's potential, because you'd have the same circuitry on the board as you did with the previous generation. It's just not a good idea.
 

Cheese

Distinguished
Jan 15, 2005
122
0
18,680
You insult Gary_Busey's idea then you insult him which means you insult the hood foo.

Don't make me get illy and pull the 9milly.
 

Gary_Busey

Distinguished
Mar 21, 2006
1,380
0
19,280
Yeah, we know, we're just playing with ya. I agree, both ideas are stupid. Graphics cards, so far, are the best, most efficient and scalable way to deliver computer graphics.
 

nigelf

Distinguished
Feb 10, 2006
171
0
18,680
I agree with most things said before. What would, in my opinion, make sense if you want to retain the ability to have many add-in cards is to have the GPU mounted outside or somewhere out of the way and have it connected via a cable (but then the bus would need to be very fast and have low latency; like an external PCI-E).
 

mr_fnord

Distinguished
Dec 20, 2005
207
0
18,680
Graphics cards, so far, are the best, most efficient and scalable way to deliver computer graphics.

The key to that phrase is 'so far'. When we've all got 4 sockets with 32 cores apiece sharing 4GB of unified cache at 16GHz, the video card will be obsolescent, mofo.

Wasn't that also the point of the Cell processor? 9 cores with a memory structure optimized for pipelined calculations, but also capable of traditional processing?
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
Ahem... it is like no one has heard about integrated graphics... Intel Extreme Graphics anyone?

Hmmm... what the original poster proposes is basically making the GPU of an integrated graphics card upgradeable... He forgets, though, that a new generation of GPUs often uses a different (more advanced) manufacturing process, gets larger (as in more transistors) and most likely will not fit into the socket designed for the previous generation...

While integrated graphics will always suck compared to stand-alone cards (because, as someone mentioned above, the GPU is not the only thing that matters), being able to upgrade the GPU of an integrated video card is not such a terrible idea per se.
 
The other thing to consider is the cost of making a mobo that supports that level of upgradability. The number of layers and the amount of traces to support the memory alone would be somewhat prohibitive, and like you mention, socket support (you could make it happen from one design, but then you lose interface efficiencies because of it).

High-end VPUs will always be add-ons separate from the mainboard, at least until we do have so much computing power that there's no longer a need. But that's even further off into the future, and once we have that level of CPU/GPU (general processing unit?) power we'll likely be moving away from normal/current design restrictions.
 

chenBrazil

Distinguished
Mar 17, 2006
136
0
18,680
It all depends on how addicted you are... you can live with an Nvidia 6100 onboard solution, or you may need "octo-SLI" for a living... cheap, good enough, maximum performance, and status rarely converge... except if you don't use computers or if you really don't care about price... so your idea is reality for onboard users... and a nightmare for the uber users...
It is hard to make everybody happy...
 

chewbenator

Distinguished
Jul 5, 2006
246
0
18,680
It would either crowd the motherboard, causing heat increases, or require larger boards. Heat from the graphics chip alone would cause the surrounding area on the board to heat up. With the current vertical orientation of graphics card slots, you move heat up and away from the board. Also, if you integrated the graphics chip into the board, you would have to purchase and install graphics RAM, or have it integrated. This could increase the cost of RAM if both system and graphics RAM were built together, not to mention the faster DDR2/3/4 RAM on video cards.
 

JonathanDeane

Distinguished
Mar 28, 2006
1,469
0
19,310
I kinda like the idea of an external box that contains a power supply and the graphics hardware needed. I know this would be far more expensive than AGP/PCIe cards, but it does sound nice :)

They already have a socket on the mobo for graphics, and they do well for running Windows! Perfectly fine for doing work if you ask me :) My laptop couldn't play Half Life 2 to save its life, but that's OK, I still enjoy it for what it's used for!

MS should have just made a port on the back of the 360 to connect a PC to, and they could have sold units as combos - graphics card and game machine connected to a monitor in one? Hmmmm, the only problem is getting enough bandwidth to the 360... I don't think people would want to wait for a DVD game to send over USB even if it was streamed lol
 

david__t

Distinguished
Feb 2, 2003
200
0
18,680
My sig refers to the fact that I had an Olivetti PC (8088) which ran at 4.77MHz, and at the time I made that sig, Tom had just done an article on overclocking the P4 to 4GHz, which was 10 years later - I didn't mean that was the exact amount of time that it really took for Intel or AMD to get from one speed to the other - it was more personal than that.
 

david__t

Distinguished
Feb 2, 2003
200
0
18,680
why don't we have a GPU socket on our motherboards instead?

We do. It's called 'AGP' or 'PCI-Express x16'.

Har Har, very funny - I never knew those existed - thanks for sharing.

It would have to be very fast memory of course, but I think this would be a great solution.

No, it would be retarded: there's a reason why GPUs have 256-bit 1.5GHz memory buses and CPUs don't.

I didn't say that this would necessarily be possible with current technology, but clearly if the CPU did have this memory bandwidth then it could be done.

Also, to those who say that it would be impractical to upgrade the GPU, just remember what happened during the Socket 7 / Socket 462 days. Socket 7 started at around 150MHz and continued right up to a K6-2 550, all on the same board. Then, similarly, the Socket A boards lasted for ages and only needed changing due to memory technology upgrades.

I understand that GPUs do sometimes have very large, sweeping changes, but as far as pure speed is concerned, CPUs have shown us that you can upgrade on the same motherboard for quite a while.

Personally, I have heard some people mentioning an external box, which would be quite good but would need a very fast connection to the PC, which might be difficult. Still a nice idea to add to the pot.

My main gripe with the current trend is that whilst CPU manufacturers have found ways to increase power in one socket, the video card makers have struggled - first they started stealing PCI slots with large cooling solutions, and now they are stealing physical slots with extra cards.

Personally, I hope that all games will be programmed to run best on one powerful single-card solution at around 1280 x 1024 - and getting SLI would not bring better graphics but would just enable higher resolutions. Forcing users to get more than one GPU just to see all the effects in a game is a dangerous path to start down.

I will now put on my flame suit :)
 

ScottyHutch

Distinguished
May 12, 2006
64
0
18,630
I would like to see ATI and nVidia focus on making low-power versions of the cards. I would be willing to pay extra for a low-power version card if it means I can save the cash otherwise spent on cooling the system down.