Dedicated GPU socket like a CPU's on a motherboard?

BigDKevin

Reputable
Mar 28, 2015
2
0
4,510
Why doesn't a GPU have its own socket on the motherboard, like a CPU does?

Surely if you split the CPU and the GPU into two different chips, they would outperform integrated graphics plus the processing portion, since both would become dedicated chips.

Just like the i3, i5, and i7 share the same socket, why not do the same for graphics chips? The i3, i5, and i7 all run at different speeds, yet they use the same socket and the socket doesn't hinder anything. Theoretically, if graphics chips were built the same way, it would work...? With AMD's HSA technology sharing the memory, this idea sounds like it could be possible.

With both as dedicated chips, performance would increase in every direction.
 

kanewolf

Titan
Moderator
For low-end graphics, that is done on server motherboards and embedded solutions where a graphics card is impractical. But a high-end graphics subsystem -- yes, subsystem -- is much more than a chip. You have dedicated memory, sometimes many GB. You have very high power requirements and appropriate heat extraction. You have DACs, audio, power infrastructure, etc. At the high end it is a system, and one that has to be designed as a system.
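To put a rough number on the power side (an illustrative sketch using PCIe spec ceilings, not figures for any particular card): a CPU socket is fed by the motherboard's own VRMs, while a high-end card has to bring its own power delivery, pulling up to 75 W from the slot and the rest through dedicated 6-pin and 8-pin connectors.

```python
# Rough power budget for a high-end card (PCIe spec ceilings, not measured values).

PCIE_SLOT_W = 75    # max draw through the x16 slot itself
SIX_PIN_W   = 75    # max draw per 6-pin PCIe power connector
EIGHT_PIN_W = 150   # max draw per 8-pin PCIe power connector

# A card with one 6-pin and one 8-pin connector:
card_budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Available to the card: {card_budget} W")  # 300 W

# All of that has to be converted by VRMs on the card itself and removed again
# as heat by the card's own cooler -- infrastructure a bare motherboard socket
# would have to duplicate for every possible GPU tier.
```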
 
Solution

someguynamedmatt

Distinguished
It's a good idea, but just isn't possible due to the power delivery requirements of a chip that dissipates so much heat, along with the fact that no desktop memory in existence will come anywhere close to the bandwidth of direct access to the GDDR5 present on your video card. Like wolf said, it isn't just a 'chip' we're dealing with, it's an entire subsystem in your PC, and as such it just isn't practical to try and stuff all of that hardware onto a socket when interfacing over PCIe works just fine.
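To put rough numbers on the bandwidth point (a back-of-the-envelope sketch assuming dual-channel DDR3-1600 on the desktop side and 7 Gbps GDDR5 on a 256-bit bus, typical of high-end cards of that era):

```python
# Back-of-the-envelope peak memory bandwidth comparison (illustrative figures).

def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mtps):
    """Peak bandwidth in GB/s: bus width in bytes times transfers per second."""
    return bus_width_bits / 8 * transfer_rate_mtps / 1000

# Dual-channel DDR3-1600: two 64-bit channels at 1600 MT/s.
ddr3 = peak_bandwidth_gbs(2 * 64, 1600)   # ~25.6 GB/s

# GDDR5 at 7 Gbps per pin on a 256-bit bus (7000 MT/s).
gddr5 = peak_bandwidth_gbs(256, 7000)     # ~224 GB/s

print(f"Dual-channel DDR3-1600: {ddr3:.1f} GB/s")
print(f"256-bit GDDR5 @ 7 Gbps: {gddr5:.1f} GB/s")
```

That's roughly a 9x gap in peak bandwidth, and that's before considering that the GDDR5 sits millimetres from the GPU die on a board laid out just for it.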
 

BigDKevin

Reputable
Mar 28, 2015
2
0
4,510
Makes total sense; I'm sure hardware engineers have thought about this, but I didn't know the reason behind it. I was looking at the insides of a gaming laptop and realized that the PCIe card in them is really just an extension, kind of like video game DLC: it could be built in, but they just don't do it, and I didn't know exactly why. Thanks for educating me on this; I was looking everywhere for the answer but couldn't find it.