Background
I bought a new 2560x1440 monitor and tried to drive it with my old ATI Radeon 5770. I bought that card a couple of years ago specifically for its low temperature, not its processing power. To my chagrin, the HDMI output on that card maxes out at 1080p. So I went into my BIOS and switched to "OBGFX", which I think stands for onboard graphics, which I also think means it's now using my Intel Core i5-6500's integrated graphics. After moving the HDMI cable from the dedicated graphics card to the motherboard, I was finally able to get 2560x1440 on the monitor. Since then, I've been able to play 1080p videos from the web and my hard drive just fine.
Questions
1) Does using the CPU's integrated graphics processor adversely affect regular processing?
2) How much does the integrated graphics processor increase CPU temperature on average?
3) Should I upgrade to a dedicated GPU if #1 and #2 are unfavorable? I mainly watch videos on this PC. I never game on it.
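For context on how much extra work the higher resolution represents, here's a quick back-of-the-envelope pixel count (plain arithmetic, not a benchmark of any particular GPU):

```python
# Pixels per frame at each resolution -- plain arithmetic, not a benchmark.
pixels_1080p = 1920 * 1080   # what the Radeon 5770's HDMI output topped out at
pixels_1440p = 2560 * 1440   # the new monitor's native resolution

print(pixels_1080p)                 # 2073600
print(pixels_1440p)                 # 3686400
print(pixels_1440p / pixels_1080p)  # ~1.78x more pixels per frame
```

So the 1440p desktop pushes roughly 78% more pixels per frame than 1080p did. Decoding a 1080p video is the same amount of work either way; the extra load on the integrated graphics is mostly in scaling that video up to the monitor's native resolution.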