Solved

Is IB i7's integrated graphics enough for this?

June 15, 2012 8:23:38 PM

I'll be getting a new computer for my WORK. In the past, I've done a dedicated GPU just so I can have smooth dual-monitor support and uncompromising Windows and desktop application performance (no gaming). But with recent improvements in Sandy Bridge and Ivy Bridge CPUs, I've been debating whether or not I can get away with something like an i7 CPU with HD4000 or whether I'd still need a dedicated GPU. Here's what I need to be able to do pretty much every day:

- Dual-monitors at native 1680 x 1050 resolution (two 22 inch monitors)
- Smooth Windows 7 64bit Aero effects (transparent windows, mouse-over preview at taskbar, etc.) and any other Windows 7 things
- MS Office 2010 apps along with IE, Firefox, Chrome, and other "regular" software like that
- Webcam over Skype or similar, using something like a Logitech USB webcam (up to 720p video quality)
- Various software development tools, such as Visual Studio 2010, SQL Server Mgmt Studio, etc.
- Adobe Photoshop for minor photo or graphics work, mostly for the web-design needs (NOT heavy photo processing like in multimedia design shops, but rather light and occasional)
- Audio-editing tools such as Adobe Audition (occasionally) for some light editing and mixing
- Video format converting (occasionally, too), and MAYBE things like Adobe Premiere Elements, from time to time
- Virtual Machines in VirtualBox

I know I need to pay attention to CPU, RAM, HDD/SSD and other stuff, but I'm really asking about the GPU (dedicated vs. integrated) here.

We usually go with Dell for work, so I might look at the Dell OptiPlex or Precision series. I see that they offer the i7-3770, for example, as well as various Xeon E3 models (which might be an option too). In most cases, adding a 1GB dedicated Nvidia or AMD card only costs $100-$200 extra, so money is not really the issue here. It's more a question of which is the smarter thing to do. I like the idea of having a smaller "footprint" (one less component, which would mean less power consumption and possibly less heat and noise) if I go with only the integrated GPU. But I cannot compromise productivity, and I don't want any annoying jerkiness going on on-screen (it has to be smooth).

I know that dual monitors are supported with IB, but what I don't know is whether I would start to see issues with any of the applications listed above.

What do you guys think? At what point do you say, "you might be better off with dedicated GPU?"


June 15, 2012 8:50:13 PM

+1 fpoon,

I'd recommend any dedicated GPU over integrated, since it will help with any task.

You can get a cheap $40-50 card that will still outperform the integrated graphics, and you'll be able to run dual monitors.
June 15, 2012 8:55:25 PM

Yes, Intel HD4000 supports dual monitors. You could probably get away with it, but a cheap graphics card is recommended.
June 15, 2012 8:57:34 PM

You guys sure about no dual-monitor support? Here's one of the Dell choices:

http://www.dell.com/us/soho/p/optiplex-9010/pd?oc=spctz...

...and if you look on the 360-degree view, you'll see a VGA + 2 DisplayPort plugs (all on-board... ignoring the dedicated DVI pictured there too). I could see that you could have a single VGA and a single DisplayPort in order to provide two different connection options, but you wouldn't have two identical DisplayPort plugs unless you could support them simultaneously, right?

EDIT: Ah, just saw the previous post... Thanks.
June 15, 2012 9:09:32 PM

the_jiveman said:
What do you guys think? At what point do you say, "you might be better off with dedicated GPU?"

In my IT dept we recently decided to change our standard build to include discrete graphics cards. We wanted to make sure they would stay viable for the three-year lease, and we wanted to avoid having to retouch desktops all across the country to add a discrete card two years down the road because we skimped now.

We run win7 x64 and our users do most of the same stuff as you, if not less.
Our build is an OptiPlex 790 MT with a 512MB AMD Radeon HD 6350 (2x DVI).

One special note is to make sure your monitor and video card have the same type of DVI connector (single vs dual link) or you will need an adapter.
June 15, 2012 9:16:37 PM

Thanks for the advice. Which CPU did you get in your 790s? I see they offer i3 through i7, and that might have some impact on integrated graphics (and I wonder if that influenced your decision to go dedicated at all).

Best solution

June 15, 2012 9:31:48 PM

Hi Jiveman - I'd choose your CPU first. Then, if it has HD4000 why not just see if it's good enough? Unless it is too much of a hassle for your folks at work to later add a $40 GPU, I'd first see if HD4000 is enough. From what I've seen it's much faster than HD3000. Your graphics processing needs aren't demanding at all. And HD4000 supports triple monitors natively (though the cable/adapter requirements might just make getting a discrete card worthwhile).
June 15, 2012 9:40:48 PM

I agree with larkspur. I think you should first see if integrated graphics is good enough, then add a graphics card if it isn't.

My understanding is that Sandy Bridge integrated graphics supports two monitors and Ivy Bridge supports three. And I don't see any usage on your list that shouldn't work fine with integrated graphics.

By the way, I am an Intel employee but I don't speak for Intel, and I'm not an expert on our graphics.
June 15, 2012 9:43:18 PM

the_jiveman said:
Thanks for the advice. Which CPU did you get in your 790s? I see they offer i3 through i7, and that might have some impact on integrated graphics (and I wonder if that influenced your decision to go dedicated at all).

We have Core i5 2400s, so they are HD3000. The Ivy Bridge CPUs are not available in our build options yet.

FWIW, if we had the new CPUs to choose from, I would order most PCs without discrete graphics and only add the video card for power users like marketing and some of the more tech savvy admins. Most of the folks in the company are just checking email, browsing gigantic ACT! databases, and doing number crunching with Excel, Argus and other financial programs. Those things need CPU and RAM, not graphics; so I wouldn't waste money on getting them discrete graphics.

Alas, I don't get to make that decision; so, in the interest of simplicity and uniformity, everyone gets graphics cards.
June 16, 2012 4:54:53 AM

larkspur said:
...Unless it is too much of a hassle for your folks at work to later add a $40 GPU, I'd first see if HD4000 is enough...

It wouldn't necessarily be too much hassle, but it typically works better when you get the whole package from somebody like Dell along with the 3-yr warranty, so you can work with their support if anything goes wrong with the hardware (rather than trying to figure out whether it's a Dell part or an aftermarket part, and who to call). But I wouldn't call it a showstopper. I think this approach is reasonable, now that most people here (so far) seem to think HD4000 would be more than sufficient for my graphics needs.

daweinah said:
FWIW, if we had the new CPUs to choose from, I would order most PCs without discrete graphics and only add the video card for power users like marketing and some of the more tech savvy admins...


Thanks, it's helpful to know that you would consider integrated if you had Ivy Bridge as a choice. When it comes to "more tech savvy admins," I would consider myself to be in that bunch. :-) But I guess I still wouldn't see my work as graphics-intensive (just good CPU/RAM and probably an SSD).

I mean, I'd definitely be treading new ground by going integrated for the first time in years, but it seems that the CPUs have come a long way, and it would be interesting to give it a shot.

I will also mention that among my CPU choices are Xeon E3 models, which are offered in the Dell Precision series desktops. If I go with one of those, and it doesn't have integrated graphics, then the choice for dedicated is obvious. But I haven't started looking into those yet to see whether I'd really need a Xeon over an i7. I know I wouldn't need more than one CPU (which Xeons support) and wouldn't need ECC RAM (which Xeons also support), so an i7 seems like the more appropriate choice.

I don't really mean to turn this into a CPU comparison thread, but can somebody tell me whether any of the Xeons come with the HD4000 GPU, or if that's reserved for the "i" CPUs?
June 16, 2012 11:12:08 PM

Xeons and the LGA 2011 i7s don't carry Intel HD Graphics; only the LGA 1155 CPUs have them.
June 18, 2012 12:31:45 AM

Best answer selected by the_jiveman.
June 18, 2012 5:07:24 PM

A few of the Xeons do in fact have HD4000. Those would be the Xeon E3 1275, 1245, and 1225. Note that these are all socket LGA 1155 parts. I think Wikipedia's Ivy Bridge page is a good reference on this.

Note that in order to use integrated graphics you need to find a motherboard that supports it; many of the Xeon motherboards do not.

By the way, I believe the "5" at the end of the Xeon part name is a hint that it has integrated graphics. It's not universally true, but most of the recent Xeons that end in 5 have integrated graphics.