Nvidia Optimus, dedicated vs. integrated: which one creates more heat?

Solution
It doesn't matter, because it's not a choice between using the Intel graphics or the Nvidia GPU. It's a choice between using the Intel graphics alone, or the Intel graphics plus the Nvidia GPU. Using the Nvidia GPU therefore always draws more power and generates more heat. It will generate considerably less heat running a browser or an IDE than it would running a game, but still more than if you ran off the Intel graphics alone.
In an Optimus laptop, the Intel integrated graphics is always on; the Nvidia GPU acts as a co-processor. When a game uses it, the Nvidia GPU renders a frame, then passes that completed frame to the Intel integrated graphics, which displays it on the screen. (Essentially, vsync is always on.)

So if you're trying to reduce heat or extend battery life with an Optimus laptop, using the Intel integrated graphics instead of Nvidia will always reduce power consumption. Any time the Nvidia GPU is used, you're drawing extra power from the battery.
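On Linux with the proprietary Nvidia driver, you can see this split directly: apps render on the Intel iGPU by default, and the Nvidia GPU only wakes up when you opt a specific app into PRIME render offload. A minimal sketch, assuming `glxinfo` (from mesa-utils) is installed and the driver supports render offload:

```shell
# Default: the app renders on the Intel iGPU.
glxinfo | grep "OpenGL renderer"

# Opt a single app onto the Nvidia GPU via PRIME render offload.
# Only now does the dGPU power up and add to the heat budget.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
    glxinfo | grep "OpenGL renderer"
```

On Windows, the equivalent knob is the per-application GPU preference in the Nvidia Control Panel or in Windows graphics settings; the principle is the same either way.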

There's a small gotcha, though. In most gaming laptops, the CPU runs a lot hotter than the GPU: CPU temperatures of 80-95 °C are not uncommon, while GPU temperatures may stay at 70 °C or below. So even though using the Nvidia GPU burns more power (the laptop's overall heat output is higher), if the laptop has separate heat pipes for the CPU and GPU, running a game off the Intel integrated graphics may actually make the CPU run hotter than if you were using the Nvidia GPU, because the rendering work that would have gone to the dGPU lands on the CPU and iGPU instead.