Solved

what happens with the on board graphics on your cpu when you get a gpu like gtx 780?

Tags:
  • Graphics
January 17, 2014 11:48:01 AM

I am getting the i7-4770K processor and the ASUS Z87-PRO, which has the Intel® Z87 Express Chipset and supports 4K resolution

source
(4K Ultra HD ready
Discover future visual experiences
Following up on popular 1080p full HD, 4K Ultra HD is the next big thing, and you’re ready for the upgrade thanks to integrated graphics that natively support up to 4096 x 2160 via HDMI or DisplayPort. That’s four times the pixel count of 1080p (1920 x 1080), offering incredible visual clarity, detail, and realism.)
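For what it's worth, that "four times the pixel count" line roughly checks out; here's a quick back-of-the-envelope in Python (purely illustrative, and note the 4096-wide mode is actually slightly more than 4x plain 1080p):

```python
# Quick sanity check of the "four times the pixel count" claim above.
uhd = 4096 * 2160   # the 4K mode the iGPU's HDMI/DisplayPort output supports
fhd = 1920 * 1080   # 1080p full HD

print(uhd / fhd)    # ~4.27, so slightly MORE than four times 1080p
```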

It also says I can hook up three monitors off it, all at 1080p and all at 60 Hz... so is there really any point in buying a standalone GPU?

If there is, then what does the computer do with this chip when it is not being used? I have looked at benchmarks with just this chipset and it hits about 50-plus fps, so is there really any point in spending $300-plus on a GPU?

thanks


January 17, 2014 11:51:50 AM

The on-die GPU is fine for 2D desktop work, but it is no gamer at 1080p... let alone 3 monitors. If you want to game, get a discrete card. If you're not a gamer, you can do fine with the on-die GPU (integrated).

If you use a discrete card, the on-die GPU is always at idle.
January 17, 2014 11:54:12 AM

As clutchc said, it will be idle. It will be ready for use should your graphics card hit the dirt, so it's nice to have.

Best solution

January 17, 2014 12:04:01 PM

Back in the old days the onboard GPU would be disabled if you added a dedicated GPU card, but today there are sooooo many more options than that:

1) By default it will still be disabled if a dedicated GPU is present and no monitors are plugged into the motherboard's ports. If you use iGPU features such as Quick Sync, the iGPU will power up for those purposes, but generally it is going to sit idle.

2) If you use software such as Virtu, you can plug the monitor into the onboard graphics port but still use the dedicated card for processing heavy workloads. The cool thing about this is that when you are not playing a game, your big monster GPU can sit idle, using much less power, but it is still there when you need it. The bad thing is that I have heard mixed reports about stability and driver-related issues... as I have not tried it myself, I can't really say how well it does or does not work.

3) You can use it as a 2nd GPU. Plug your main displays into the GPU for gaming use and such, but then you can plug extra monitors into the iGPU for extra screen real estate. Note that monitors plugged into the iGPU will be processed by the iGPU and so it will have limited 3D performance, but desktop and video performance should be fine.


Do you need a dedicated GPU?
Well, it all depends on what you are going to be doing with it. If you are doing video editing, desktop work, browsing the web, and watching videos, then there is little to no point in getting a dedicated GPU, especially with the higher-end Intel chips with the better iGPU packages.
If you are doing any sort of gaming, 3D content creation, or running a butt load of monitors, then an add-on card is required for a good experience. While onboard graphics have gotten much better in recent years, they are still no replacement for heavy 3D workloads. You can play many smaller games without issue at 1080p with low to medium settings, and some older games will run fine even on high settings. But modern AAA games with high settings can bring even large multi-GPU rigs to their knees at 1080p (to say nothing of higher resolutions), so obviously onboard graphics would not be able to hack these kinds of workloads.
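To put rough numbers on the multi-monitor point, here's an illustrative Python sketch of the raw pixel throughput involved (real 3D rendering costs far more per pixel, so treat these as lower bounds):

```python
# Ballpark fill demand for the triple-1080p60 setup discussed in the thread.
w, h, hz = 1920, 1080, 60

one_display = w * h * hz          # pixels per second for a single 1080p60 screen
three_displays = 3 * one_display  # the proposed three-monitor setup

print(one_display)     # 124416000
print(three_displays)  # 373248000
```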
January 17, 2014 12:10:55 PM

The onboard GPU will start to melt and self-destruct when a discrete GPU is detected. :p 

But seriously, what CaedenV said.

Personally, I find it simpler to have the onboard graphics deactivated and everything running on the discrete card. Even a very powerful card like the GTX 780 doesn't consume a lot of energy when idle or running 2D applications.
January 17, 2014 12:30:10 PM

Just seems a waste? Surely they would do something useful with it?
January 17, 2014 12:37:38 PM

Intel insists on putting an on-die GPU on their 'enthusiast' unlocked CPUs, the very CPUs that users will almost always pair with a discrete card.

(Some MBs will allow the GPU to remain active when a discrete card is installed, allowing the integrated ports to be used for a 2nd monitor.)
January 17, 2014 12:55:56 PM

clutchc said:
Intel insists on putting an on-die GPU on their 'enthusiast' unlocked CPUs, the very CPUs that users will almost always pair with a discrete card.

(Some MBs will allow the GPU to remain active when a discrete card is installed, allowing the integrated ports to be used for a 2nd monitor.)


No kidding. Let's axe that useless iGPU and add more or larger cores! Putting a decent iGPU on an i3 would simply dominate the home computer market... but nooooooo, that would blow their price scheme out of whack and cause customer confusion.
January 17, 2014 2:53:47 PM

CaedenV said:
clutchc said:
Intel insists on putting an on-die GPU on their 'enthusiast' unlocked CPUs, the very CPUs that users will almost always pair with a discrete card.

(Some MBs will allow the GPU to remain active when a discrete card is installed, allowing the integrated ports to be used for a 2nd monitor.)


No kidding. Let's axe that useless iGPU and add more or larger cores! Putting a decent iGPU on an i3 would simply dominate the home computer market... but nooooooo, that would blow their price scheme out of whack and cause customer confusion.


Lol... I guess you have to expect that from a company that changes sockets every time they get a new idea. At least they are still using sockets... for now.
January 17, 2014 3:00:08 PM

The BGA silliness will only last a year, and those chips will only offer a 2-5% performance increase over the refresh chips coming out in a few months anyway. After that, though, we get to look forward to Skylake, which looks to be Intel's love letter to enthusiasts everywhere and finally a focus on performance instead of wattage for the first time since Sandy Bridge.
January 17, 2014 4:07:00 PM

RushNReady said:
Just seems a waste? Surely they would do something useful with it?


I don't see how it's a waste... Your GTX 780 will be like 100 times faster than the onboard graphics. Why in the hell would you want to keep using the integrated chip when you've got one of the most powerful cards around?

The real waste is, as clutchc mentioned, that the space used for integrated graphics on the chip reduces the area available for the CPU. If Intel totally removed integrated graphics from their unlocked CPUs (K series), it would give more room to overclock. As he said, enthusiasts purchasing these chips will always get a discrete graphics card anyway.

January 17, 2014 6:16:48 PM

MC_K7 said:
RushNReady said:
Just seems a waste? Surely they would do something useful with it?


I don't see how it's a waste...



You just explained how it's a waste lol. It's a waste of money and space if we're just going to get a GPU anyway. I will be doing video editing too; will it still be best to use the GPU? I'm still deciding between a 780 and 760 SLI lol
January 17, 2014 6:31:14 PM

I thought you meant it's a waste that you cannot use the integrated graphics, and I meant it's a waste of space on the CPU chip. So never mind if you meant the same thing. ;)
January 17, 2014 7:19:20 PM

If you're not a gamer, then try the on-die GPU for a while. It will be more than enough for video work. Later, if you find something it isn't handling as well as you think it should, add the discrete card.

If you ARE a gamer, then you have to add a discrete card.
January 18, 2014 12:55:31 AM

clutchc said:
If you're not a gamer, then try the on-die GPU for a while. It will be more than enough for video work. Later, if you find something it isn't handling as well as you think it should, add the discrete card.

If you ARE a gamer, then you have to add a discrete card.


I will want to be gaming, so I'm planning on getting MSI TF GeForce 760s in SLI
January 18, 2014 6:50:27 PM

RushNReady said:
MC_K7 said:
RushNReady said:
Just seems a waste? Surely they would do something useful with it?


I don't see how it's a waste...



You just explained how it's a waste lol. It's a waste of money and space if we're just going to get a GPU anyway. I will be doing video editing too; will it still be best to use the GPU? I'm still deciding between a 780 and 760 SLI lol


As someone who builds video editing rigs: don't worry about the GPU. Not many programs use GPU acceleration, and those that do typically only use it for very specific effects; more often than not, your HDDs will bottleneck GPU rendering anyway.

By all means, get a GPU for gaming or better multi-monitor support, but when it comes to editing, it is all about HDD throughput and CPU cores.
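To illustrate that point with some back-of-the-envelope math (my own illustrative numbers, assuming an uncompressed-style 8-bit 4:2:2 1080p30 stream, which averages 2 bytes per pixel):

```python
# Rough data rate for 8-bit 4:2:2 1080p30 video (illustrative assumption).
bytes_per_pixel = 2                               # 4:2:2 averages 2 bytes/pixel
bytes_per_frame = 1920 * 1080 * bytes_per_pixel   # one uncompressed frame
mb_per_second = bytes_per_frame * 30 / 1_000_000  # at 30 fps

print(round(mb_per_second, 1))  # 124.4 -- right around a typical HDD's
                                # real-world sequential throughput
```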
January 18, 2014 7:15:33 PM

I will have an SSD for editing video; I will put all my raw files there and render to it. This will help, won't it?
January 19, 2014 11:16:53 AM

Yep, an SSD will put the bottleneck on the CPU, where it belongs.
January 19, 2014 11:43:13 AM

An i7-4770K should not bottleneck lol