Need Recommendation for Graphics Card for Photo Editing

ExquisitlyBad

Distinguished
May 28, 2008
I want to upgrade my video card and monitor. The main use is for photo editing using Nikon Capture NX and Photoshop CS. I also want to install two LCD monitors.

Can anyone recommend a video card for my application? Also, will I need two cards in order to support two monitors?

Grateful for your input.
 

radnor

Distinguished
Apr 9, 2008
Check out CUDA from Nvidia, and the list of CUDA-compatible GPUs.

Pick one!! Most of them already have two DVI outputs by now.

And ya should be fine :)
 

cleeve

Illustrious
For Photo editing? Are you for real?

For photo editing, pick a cheapo Radeon 2400 or GeForce 8400... basically, the cheapest card you can find with dual outputs. You might have to step up to a 2600 PRO or 8500 GT.

Photo editing requires absolutely no 3D power whatsoever.

CUDA? Why the hell would he need that for photo editing? Good lord.
 

runswindows95

Distinguished
*runs Photoshop CS2 on a lowly 7300LE with no issues*

Edit: Photoshop has always been CPU-heavy, not GPU-heavy. At most, the highest you need is an 8600GT or an HD3650, since those cards always have dual DVI.
 

I think he's referring to Adobe CS4, coming out in October. It'll show gains in multiple selections (more than one instance), even in photo editing, using CUDA.
 
Yeah, there are limited applications of GPU assistance for Photoshop and other 2D editing apps, but they are still very early and very limited. Not everyone will benefit greatly from GPGPU acceleration even when it finally does arrive.

And it's not just CUDA-based, either.

While there is GPGPU assistance planned for the future of Photoshop, IMO don't worry about it: get a small mid-range DX10 card from either IHV, and by the time the feature arrives, even those will give you a little boost.

But the big focus for this stuff will be extremely large images and video. Video editing is where I would start planning ahead, since it's already being implemented there, whereas for Photoshop it's still a 'future direction of our app' project.

Without GPGPU in mind, I would say get the cheapest passively cooled card with 10-bit support that you can; with GPGPU in mind, I'd say get the cheapest GF8400 / HD2400 you can find that's passively cooled.

And if you're super over-analyzing the purchase and looking way ahead, then consider that colour support may change in the next generation in order to support some aspects of HDMI 1.3 'Deep Colour', as mentioned in early releases. However, for most people, even if they add it, they'll never notice it or the lack of it.
 

cleeve

Illustrious
Well, if CS4 is using CUDA, my apologies. I'd heard it was 3D-accelerated, but I didn't know it was based on Nvidia's CUDA.

I can't dig up a confirmation of that on the web. Are you sure it's based on Nvidia's proprietary technology and not just a generic implementation? It seems strange for Adobe to pick sides in the graphics war...
 
Yeah, I'm not sure it's CUDA-specific at this point, since Adobe themselves were almost unspecific on purpose to avoid ruling anything out. It seems to me the CUDA talk came from nV's adoption of the announcement at the recent nV Days conference thingy.

I'd be surprised if it was nV-specific, but you never know; it could be part of their new TWIMTBP program, except now it means: The Way It's Meant To Be PhotoChopped. [:mousemonkey:1]
 

The_Abyss

Distinguished
Mar 24, 2006
The guy is photo editing. Unless he's using a pro camera, the largest images he'll likely encounter will be 50MB RAW files, not the 2GB monster image shown at the CS4 demo.

A decent graphics card will suffice in this instance, not a GT200 leviathan.
 
Exactly. It 'could' help for applying filters and effects on smaller images, and possibly for resizing etc., but we're probably talking about a difference of 1-2 seconds versus 3-5 to apply something. Sure, it'd be a nice bonus, but not something I'd buy for right now, at least until they expand further on exactly what it will do (and whether it will even be offered on the single-precision GF8/9 series).

There aren't enough details to make it any more important for this than DX10.1 is for gaming right now.