Will a lower end GPU be of any benefit over my i7 integrated HD 4000?

philmar

Distinguished
Jun 12, 2008
23
0
18,510
APPROXIMATE PURCHASE DATE: today
BUDGET RANGE: USD $150-200

USAGE FROM MOST TO LEAST IMPORTANT: Photo editing only, with Lightroom 5.7. Absolutely no 3D or gaming.

CPU: Intel Core i7-3770K 3.5GHz
CPU Cooler: Cooler Master Hyper 212 EVO
Motherboard: Asus P8Z77-V PRO/THUNDERBOLT
Memory: G.Skill Ripjaws X Series 32GB
Storage: Samsung 850 EVO-Series 250GB
Storage: Samsung 840 Pro Series 256GB
Storage: Seagate Barracuda 3TB 3.5" 7200RPM
Storage: Western Digital Red Pro 6TB 3.5" 7200RPM
Power Supply: Antec High Current Gamer 620W 80+ Bronze

I recently bought a BenQ SW2700PT QHD 2560x1440 monitor and run a two-monitor setup along with a 20-inch NEC 20WMGX2 at 1680 x 1050. I do NOT game. I use the system solely for photo editing, with no 3D requirements. I use predominantly Lightroom 5.7 and Photoshop CS4. Neither program makes use of GPU acceleration, so I have NO interest in or need of an expensive GPU, as it will not aid my editing. Photo editing is a CPU-intensive activity.

I find that Lightroom now slows down considerably on occasion: in addition to the pixel-moving calculations it already does, it is (I assume) now also tasked with mapping them onto a larger, higher-resolution monitor. I am wondering if a cheap GPU would be a good-value upgrade for my system. Would a discrete GPU take any load off the onboard Intel HD 4000 graphics (which I assume is now working harder to update my monitors) and free up the CPU to do more photo pixel pushing and process my photo edits faster? Note that my Z77 mobo has a feature called LucidLogix Virtu MVP which claims to integrate well with a discrete GPU.

My build is 3 years old, and as I plan a complete rebuild in another 2 years, I would only like to make a small incremental upgrade now (max $200). Would a low-end GPU help my system better drive my 2 monitors and free up CPU processing power for photo editing... and would it have both a DisplayPort and a DVI output for 2-monitor support? Sincerest thanks for all help!
 
Solution

atljsf

Honorable
BANNED
Get a 1050; it will work for games too, since we all game at one time or another.

Also, the CUDA cores in it will help in most editing apps.

As for the Lightroom effects: in most apps that calculation is done by the CPU. If by any chance it's the opposite, the GPU mentioned will fit the bill and work well with it, a lot better than the integrated GPU.
 
Basically, no. There won't be any performance increase for Lightroom 5.7. GPU acceleration was added as of Lightroom 6, and that would probably result in a speed increase for specific functions (though slower in others), even on integrated graphics. There's a more in-depth explanation from Adobe here:

https://forums.adobe.com/thread/1828580

But the load (or lack thereof) on the HD 4000 won't have any effect on CPU processing.
 

Hello man

Honorable


Uh, the short answer is absolutely. You could probably get a GTX 680, for example, an older used card, for about $140 in great shape. I have a GTX 670 GC V2 from Galaxy in one of my rigs, and let me tell you: the 3+ teraflop floating-point performance would decimate that HD Graphics 4000. I would know; I have an older 2013 Retina MacBook Pro with HD Graphics 4000.

However, you could get a GTX 1060, a current-generation card, for $200, maybe a little less on sale. It hits about 4 TFLOPS as far as I can see on Google, though which benchmark is used really does make a difference.
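As a rough sanity check on those numbers, peak single-precision throughput can be estimated as shader count x clock x 2 FLOPs per cycle (one fused multiply-add). The shader counts and boost clocks below are approximate published specs, so treat the results as ballpark figures only:

```python
# Rough peak FP32 throughput: shaders * clock (GHz) * 2 FLOPs/cycle (FMA).
# Shader counts and boost clocks are approximate published specs.
gpus = {
    "HD Graphics 4000": (16 * 8, 1.15),  # 16 EUs, ~8 FP32 ops per EU per clock
    "GTX 680":          (1536, 1.058),
    "GTX 1060":         (1280, 1.708),
}

for name, (shaders, ghz) in gpus.items():
    tflops = shaders * ghz * 2 / 1000
    print(f"{name}: ~{tflops:.2f} TFLOPS")
```

That lines up with the "3+ teraflop" figure for the 680 (~3.25) and "about 4" for the 1060 (~4.37), with the HD 4000 around 0.29; but again, peak TFLOPS is a theoretical ceiling, not a benchmark.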

It should make a difference as far as base load. You can actually enable some GPU-accelerated features in CS4, according to Adobe. Lightroom apparently has some too, though I can't seem to deduce which version they start at.

Long and short of it is that it probably makes sense to a certain degree to get something that simply works and fits the bill. This leaves a lot of options open, but I wouldn't go further back than the 6 series of GPUs from Nvidia personally.
 
Solution

philmar



Thanks!
I'm aware my LR v5.7 won't use the GPU to help render the photos faster.
I've recently noticed that the calculations slow down on occasion now that the integrated GPU is driving the higher-resolution monitor I recently added. I'm wondering if a discrete GPU would take some of that load off the CPU by helping to drive the monitor, and therefore free up the CPU to do more of the pixel-moving calculations involved in 2D photo manipulation.

 

philmar



Thanks - I'm a photo enthusiast, not a pc guy.
https://www.flickr.com/photos/phil_marion/albums
I'm required to do my craft on computers but have little understanding of how the PC components work and play together.
I'm just curious whether a low-end GPU would pick up some of what I assume is the increased load on my integrated HD 4000 graphics, and whether that would translate into more power for the CPU to use in the CPU-intensive mathematical calculations needed for photo processing.
Would there be much difference between a GTX 680 and a GTX 1060 for me, for 2D photo processing? I know the more recent one is clearly a better card, but for photo processing I've been just as well served by my HD 4000; photo processing is 2D number crunching that doesn't require a GPU. I just wonder if a cheap GPU would help run my new QHD monitor (the source of my new slowdowns), thus freeing up the CPU.
 

It really shouldn't slow it down at all. With the exception of RAM (which you have plenty of), IGPs have their own resources. If the IGP isn't in use, those resources would just sit idle; they wouldn't help the CPU.

However, I did see this about screen resolution and previews:

"The larger the monitor you use (and the higher resolution), the more work Lightroom does to calculate previews and update pixels when you make adjustments. If you experience performance slowdowns with large monitors, try reducing resolution of the display using the Display Control Panel (Windows) or Displays System Preferences (Mac OS)."

I'm not really familiar with Lightroom, but this sounds a lot like what you're experiencing. It would make sense if before it was generating a 1050p preview and now it's making a 1440p preview. Here's where I saw that, btw:

https://helpx.adobe.com/lightroom/kb/optimize-performance-lightroom.html
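To put numbers on that preview workload, here's a quick back-of-the-envelope comparison of the pixel counts of the two monitors mentioned in the thread:

```python
# Pixels Lightroom must compute for a full-screen preview on each monitor.
old = 1680 * 1050   # NEC 20WMGX2
new = 2560 * 1440   # BenQ SW2700PT

print(old)                    # 1764000
print(new)                    # 3686400
print(round(new / old, 2))    # 2.09
```

So the new monitor asks Lightroom for roughly twice the pixels per preview update, which would explain a noticeable slowdown on CPU-only rendering.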
 

Hello man

Honorable


Generating previews does take some time, but storage speed makes the biggest difference. I recently moved all my photos and video from some older drives to two 3TB drives in RAID 1 for redundancy. WAY faster load times.

However, this really shouldn't be an issue for you. You have some very nice storage drives.

I am not 100% sure that a GPU would speed up overall workflow in anything but Photoshop. I do notice it there, and as an occasional user of Premiere Pro I REALLY notice it there.

Lightroom has always seemed a little sluggish to me. You have a lot of RAM and good storage drives which is about all you can do to improve this.

I would recommend getting a dedicated GPU at least in the future, considering many programs are slowly moving toward utilizing GPUs more (for some reason a Graphics Processing Unit is good at rendering graphics, what do you know?!).
 

philmar



Definitely, when LR is coded to better utilize GPU acceleration I will invest in a quality GPU; that should happen at my next rebuild in 2 years' time.
Maybe I'll unplug the 2nd monitor to see if that helps... then I may learn about overclocking. After all, I did buy decent cooling and an overclockable CPU. With 2 years to go on this build, maybe that is in the cards.
 

Hello man

Honorable


I'm sure you could get a safe overclock up to 4.0 GHz. It isn't that hard to do, considering Intel Turbo Boost already takes it from 3.5 to 3.9 GHz under significant load.
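For a ballpark of what that overclock is worth, here's a best-case estimate assuming a fully CPU-bound task that scales linearly with clock speed (real-world gains will be smaller, and the turbo clocks are the published specs for the 3770K):

```python
def speedup(base_ghz, oc_ghz):
    """Best-case speedup, assuming work scales linearly with clock."""
    return oc_ghz / base_ghz - 1

print(f"{speedup(3.5, 4.0):.0%}")  # vs. the 3.5 GHz base clock: 14%
print(f"{speedup(3.9, 4.0):.0%}")  # vs. the 3.9 GHz max turbo: 3%
```

In other words: up to ~14% over base clock, but only a few percent over what Turbo Boost already delivers, so temper expectations accordingly.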

The other option is to get a faster processor with your new build. You'll need a new motherboard for that, unfortunately, but faster options certainly exist, considering the 3770K is from 2012.
 

philmar



Thanks. I'd read that before I bought the monitor, so I was expecting slowdowns. That is what caused me to add a 2nd SSD, to optimize that part of the processing pipeline. I am convinced the situation in that link IS the issue I am experiencing.
I just wonder WHAT exactly is bottlenecked at that moment and whether a cheap GPU would alleviate it. It is quite possible that Lightroom, which is not coded to use the GPU, is the problem, and that the CPU would remain overwhelmed regardless of a discrete GPU. Or maybe there is some kind of memory bandwidth issue that a discrete GPU just MAY lessen.
Was hoping there was a photo enthusiast who was also a pc enthusiast who had the same dilemma and had the experience of dropping in a cheap GPU.
If I had a GPU lying around and knew how to install it, I'd be more than happy to just try it and see... but I'm a photographer, Jim, not a PC techie!! (Star Trek reference, FYI)
 

Hello man

Honorable


Yeah, unfortunately the only thing I can think of is setting things up so the photos you are working on live on the SSD, then moving them to the storage drives afterwards. I know this will mess with the Lightroom catalog a bit, though.

I can't really tell you what Lightroom is like without a dedicated GPU: the only machine I have without one is my laptop with HD Graphics 4000, and my desktop, where I do most of my editing, certainly has one along with very fast storage, which is where I assume the difference would be noticeable.
 


I'm sorry about all the confusion; I should have explained more. The GPU isn't doing any of the processing of the images, so it has no effect on CPU load. It's just displaying what your CPU calculates. And the CPU would be doing the same job whether you use the IGP or a discrete GPU; it has to calculate the same number of pixels either way. As far as I can tell, the amount of RAM matters a lot, the bandwidth not really, and you're already using SSDs. So changing the GPU won't affect the speed of Lightroom (5.7, at least) at all. It comes down solely to processor speed, because Lightroom is calculating a larger image with more pixels.

Now, you do have an unlocked processor, a good cooler, and a mobo that supports overclocking. That would definitely have a noticeable effect on processing time, and would cost nothing but a little more electricity. There are plenty of guides here on how to do it safely.
 

philmar



Thanks to everyone for their help!!

NB > I've decided to buy a Radeon RX 460