What happens with the onboard graphics on your CPU when you get a GPU like a GTX 780?

RushNReady

Honorable
I am getting the i7-4770K processor and the ASUS Z87-PRO, which has the Intel® Z87 Express chipset and supports 4K resolution.

Source:
(4K Ultra HD ready
Discover future visual experiences
Following up on popular 1080p full HD, 4K Ultra HD is the next big thing, and you’re ready for the upgrade thanks to integrated graphics that natively support up to 4096 x 2160 via HDMI or DisplayPort. That’s four times the pixel count of 1080p (1920 x 1080), offering incredible visual clarity, detail, and realism.)
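
Quick sanity check on that "four times" figure, just the arithmetic from the numbers in the quote (rough Python, nothing assumed beyond those resolutions):

```python
# 4096x2160 is the mode the board advertises; 1920x1080 is standard 1080p
uhd_pixels = 4096 * 2160   # 8,847,360 pixels
fhd_pixels = 1920 * 1080   # 2,073,600 pixels
print(uhd_pixels / fhd_pixels)   # ~4.27, so "four times" is roughly right
```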

It also says I can hook up three monitors off it, all at 1080p and all at 60 Hz... so is there really any point in buying a standalone GPU?

If there is, what does the computer do with the integrated graphics? Does it just go unused? I have looked at benchmarks with just this chip and it hits about 50+ fps, so is there really any point in spending $300+ on a GPU?

thanks
 

clutchc

Titan
Ambassador
The on-die GPU is fine for 2D desktop work, but it is no gamer at 1080p... let alone 3 monitors. If you want to game, get a discrete card. If you're not a gamer, you can do fine with the on-die GPU (integrated).

If you use a discrete card, the on-die GPU is always at idle.
 
Back in the old days, the onboard GPU would be disabled if you added a dedicated GPU card, but today there are sooooo many more options than that:

1) By default it would still be disabled if a GPU is present and no monitors are plugged into the port. If you use iGPU features such as Quick Sync, it will power up the iGPU for those purposes (there is a quick sketch of this after the list), but generally it is going to sit idle.

2) If using software such as Virtu, you can plug the monitor into the onboard graphics port but still use the dedicated card for processing heavy workloads. The cool thing about this is that when you are not playing a game your big monster GPU can sit idle, using much less power, but it is still there when you need it. The bad thing is that I have heard mixed reports about stability and driver-related issues... as I have not tried it myself I can't really say how well it does or does not work.

3) You can use it as a 2nd GPU. Plug your main displays into the dedicated GPU for gaming and such, but then plug extra monitors into the iGPU for extra screen real estate. Note that monitors plugged into the iGPU will be processed by the iGPU, so they will have limited 3D performance, but desktop and video performance should be fine.
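
To make option 1 concrete: even with every monitor on the dedicated card, the iGPU's Quick Sync encoder can still be put to work. A minimal sketch, assuming an ffmpeg build compiled with QSV support; the file names and bitrate are just placeholders:

```python
# Sketch: run an H.264 encode on the Intel iGPU's Quick Sync hardware via ffmpeg,
# leaving the dedicated card untouched. Assumes ffmpeg was built with QSV support.
import subprocess

def quicksync_encode(src: str, dst: str, bitrate: str = "5M") -> None:
    """Transcode src to dst using the iGPU's hardware H.264 (QSV) encoder."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,           # input file
            "-c:v", "h264_qsv",  # Quick Sync (QSV) H.264 encoder
            "-b:v", bitrate,     # target video bitrate
            dst,
        ],
        check=True,              # raise if ffmpeg exits with an error
    )

quicksync_encode("input.mp4", "output_qsv.mp4")
```

The encode runs on the iGPU's fixed-function hardware, so the dedicated card stays free for whatever else you are doing, and the iGPU goes back to idle when it finishes.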


Do you need a dedicated GPU?
Well, it all depends on what you are going to be doing with it. If you are doing video editing, desktop work, web browsing, and watching videos, then there is little to no point in getting a dedicated GPU, especially with the higher-end Intel chips that have the better iGPU packages.
If you are doing any sort of gaming, 3D content creation, or running a buttload of monitors, then an add-on card is required for a good experience. While onboard graphics have gotten much better in recent years, they are still no substitute for a dedicated card under heavy 3D workloads. You can play many smaller games without issue at 1080p on low to medium settings, and some older games will run fine even on high settings. But modern AAA games at high settings can bring even large multi-GPU rigs to their knees at 1080p (to say nothing of higher resolutions), so onboard graphics obviously cannot hack those kinds of workloads.
 
Solution

MC_K7

Distinguished
The onboard GPU will start to melt and self-destruct when a discrete GPU is detected. :p

But seriously, what CaedenV said.

Personally, I find it simpler to have the onboard graphics deactivated and everything running on the discrete card. Even a very powerful card like the GTX 780 doesn't consume a lot of energy when idle or running 2D applications.
 

clutchc

Titan
Ambassador
Intel insists on putting an on-die GPU on their 'enthusiast' unlocked CPUs, the very CPUs that users will almost always pair with a discrete card.

(Some motherboards will allow the on-die GPU to remain active when a discrete card is installed, allowing the integrated ports to be used for a 2nd monitor.)
 


No kidding. Let's axe that useless iGPU and add more or larger cores! Putting a decent iGPU on an i3 would simply dominate the home computer market... but nooooooo, that would blow their pricing scheme out of whack and cause customer confusion.
 

clutchc

Titan
Ambassador


Lol... I guess you have to expect that from a company that changes sockets every time they get a new idea. At least they're still using sockets... for now.
 
The BGA silliness will only last a year, and those chips will only offer a 2-5% performance increase over the refresh chips coming out in a few months anyway. After that, though, we get to look forward to Skylake, which looks to be Intel's love letter to enthusiasts everywhere and finally a focus on performance instead of wattage for the first time since Sandy Bridge.
 

MC_K7

Distinguished


I don't see how it's a waste... Your GTX 780 will be like 100 times faster than onboard graphics. Why in the hell would you want to keep using an integrated chip when you've got one of the most powerful cards around?

The real waste is, as clutchc mentioned, that the space used for integrated graphics on the die reduces the area available for the CPU. If Intel removed integrated graphics from their unlocked CPUs (K series) entirely, it would give more room to overclock. As he said, enthusiasts purchasing these chips will almost always get a discrete graphics card anyway.

 

RushNReady

Honorable


You just explained how it's a waste lol, it's a waste of money and space if we're just going to get a GPU anyway. I will be doing video editing too, will it still be best to use the GPU? I'm still deciding between a 780 and 760 SLI lol.
 

clutchc

Titan
Ambassador
If you're not a gamer, then try the on-die GPU for a while. It will be more than enough for video work. Later, if you find something it isn't handling as well as you think it should, add the discrete card.

If you ARE a gamer, then you have to add a discrete card.
 

RushNReady

Honorable


I will want to be gaming, so I'm planning on getting MSI TF GeForce 760s in SLI.
 


As someone who builds video editing rigs: don't worry about the GPU. Not many programs use GPU acceleration, and those that do typically only use it for very specific effects; more often than not, your HDDs will bottleneck GPU rendering anyway.

By all means, get a GPU for gaming or better multi-monitor support, but when it comes to editing it is all about HDD throughput and CPU cores.
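
To put very rough numbers on the HDD point (every figure below is an assumption for illustration, not a measurement):

```python
# Ballpark check on whether a single HDD can feed uncompressed HD video.
# All numbers here are illustrative assumptions, not benchmarks.

def stream_rate_mb_s(width: int, height: int, bytes_per_pixel: int, fps: int) -> float:
    """Data rate of an uncompressed video stream in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

rate = stream_rate_mb_s(1920, 1080, 3, 30)  # 8-bit RGB 1080p30 -> ~187 MB/s
hdd = 150.0                                 # assumed sustained single-HDD throughput, MB/s

print(f"uncompressed 1080p30: ~{rate:.0f} MB/s, single HDD: ~{hdd:.0f} MB/s")
print("disk-bound" if rate > hdd else "disk keeps up")
```

Under those assumptions, a single spinning drive can't even keep up with one uncompressed 1080p30 stream, which is the kind of bottleneck I'm talking about.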