Will using integrated graphics for the monitor and the dedicated GPU for rendering improve performance?

RMDEQ1

Mar 4, 2016
I have Intel HD (5000?) graphics on board (integrated) and a GeForce GTX 980 Ti (dedicated). The integrated graphics can handle all the games I play just fine.

When I use the system for renders, my GPU utilization is always at 99%. How much of the dedicated GPU is being taken away from rendering to handle display/monitor tasks? I'm not playing games or videos; I usually just have a few browser windows open.

I haven't benchmarked a render both ways, which I suppose would answer the question, but I was hoping someone here had tried this already. What prompted me to ask is that during a render, the Nvidia GPU utilization window gets a little corrupted, and the app I use for rendering does too; it needs a minimize/resize to get back to normal. Minor annoyances, but it would be nice to avoid them.
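If you want numbers before moving any cables, the NVML library that ships with the Nvidia driver (the same interface nvidia-smi uses) can poll utilization while a render runs. Below is a minimal sketch, not a polished tool: it assumes a POSIX system, assumes the 980 Ti enumerates as device 0, and links with -lnvidia-ml.

#include <stdio.h>
#include <unistd.h>
#include <nvml.h>

int main(void) {
    /* Poll whole-GPU utilization once per second for half a minute. */
    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);   /* assumption: the 980 Ti is device 0 */
    for (int i = 0; i < 30; ++i) {
        nvmlUtilization_t u;
        if (nvmlDeviceGetUtilizationRates(dev, &u) == NVML_SUCCESS)
            printf("GPU %3u%%  memory controller %3u%%\n", u.gpu, u.memory);
        sleep(1);                          /* use Sleep(1000) on Windows */
    }
    nvmlShutdown();
    return 0;
}

Note that NVML reports whole-GPU utilization, so it won't split render work from display work for you; the honest comparison is still timing the same render once with the monitor on each adapter.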
 
Solution


Your rendering program should be able to use the dedicated GPU even when it is not the main graphics adapter, so you should be just fine switching the monitor to the integrated graphics.
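You can see this from CUDA itself: the runtime enumerates every CUDA-capable device whether or not it is driving a display, and the renderer picks its compute device explicitly. A minimal sketch (assumes the CUDA toolkit is installed; compile with nvcc; device index 0 for the 980 Ti is an assumption):

#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int n = 0;
    cudaGetDeviceCount(&n);
    for (int i = 0; i < n; ++i) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, i);
        /* kernelExecTimeoutEnabled reports the OS watchdog that caps kernel
           run time; it is typically set while the GPU drives a display. */
        printf("device %d: %s, display watchdog: %s\n",
               i, p.name, p.kernelExecTimeoutEnabled ? "on" : "off");
    }
    cudaSetDevice(0);   /* assumption: the 980 Ti enumerates as device 0 */
    return 0;
}

Whether the watchdog flag clears after moving the monitor depends on the OS and driver model, but the device stays fully usable for compute either way.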
 