Advice Plz: E6700 or E6600 and new monitor?

guinness74

I'm about to buy a new PC with a $3000 budget. As of now, I plan on using my current 2.1 Klipsch speakers as well as my existing LCD monitor

Samsung Syncmaster 730B - 17", 1280x1024 resolution, 600:1 contrast, 8 ms response time

With the specs I have mapped out (below), would it make more sense to upgrade my processor to E6700 or keep the E6600 and upgrade my monitor to this:

Samsung Syncmaster 204BW - 20", 1680 x 1050 resolution, 700:1 contrast, 6 ms response time

OR is there another monitor option that I should explore?

Lastly, I think my PSU is overkill, thoughts?

Thanks in advance!

Here are the specs I am planning:

Case: Gigabyte ATX full tower (black)
Cooling: Gigabyte GH-WIU02 3D Galaxy II Liquid Cooling
OS: MS XP Professional (w/ Vista upgrade coupon)
PSU: Thermaltake W0131RU 850W RT (ATX12V/ EPS12V 100 - 240 V CE)
Mobo: NVIDIA nForce 680i SLI ATX Intel Motherboard
Processor: Intel Core 2 Duo E6600 1066MHz FSB 4M shared L2 Cache LGA 775
RAM: CORSAIR XMS2 2GB (2 x 1GB) 240-Pin DDR2 SDRAM DDR2 800 (PC2 6400)
GPU: NVIDIA GeForce 8800GTX 768MB 384-bit GDDR3 PCI Express x16 HDCP
Sound Card: SNDCD CREATIVE|70SB073A00000 RT
HD: 2X Seagate Barracuda 320GB 7200RPM 16MB Cache SATA 3.0Gb/s
SONY Black 16X IDE DVD-ROM Drive
SAMSUNG 18 X SuperMulti Dual Layer DVD Burner with LightScribe
 
I would go with the E6600 and a LCD monitor. The E6700 will not increase your gaming performance much, but a larger LCD monitor will let you enjoy gaming more.

Your PSU is overkill; a good-quality 600W PSU is more than enough for you unless you intend to stick another 8800GTX in your system.
 

weskurtz81

Actually, a good-quality 600W PSU with enough amperage on the 12V rail(s) would be plenty even for an 8800GTX SLI system.

CPU, yeah, I would go with the 6600 or the 6400....

And I would spend the money on a larger monitor, since the smaller monitor wouldn't make as much use of the 8800GTX as a larger one would.
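The 12V amperage point can be turned into rough arithmetic. The per-component wattages below are illustrative assumptions (not measured figures for any specific system), but they show how a watt budget maps to amps on the 12V rail(s):

```python
# Rough 12V-rail budget for a hypothetical 8800 GTX SLI build.
# The per-component wattages are assumed ballpark peak draws.
components_12v = {
    "8800 GTX #1": 145,
    "8800 GTX #2": 145,
    "Core 2 Duo E6600": 65,
    "drives, fans, pump": 60,
}

total_watts = sum(components_12v.values())
amps_needed = total_watts / 12      # P = V * I  ->  I = P / V
headroom = amps_needed * 1.2        # ~20% safety margin

print(f"Estimated 12V load: {total_watts} W = {amps_needed:.1f} A")
print(f"With 20% headroom:  {headroom:.1f} A")
```

Around 35-42 A of combined 12V capacity under these assumptions, which is why a quality 600W unit with strong 12V rails can carry an SLI load that a cheap one with the same sticker wattage cannot.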

wes
 
Guest
Definitely keep the CPU and take the monitor option. I just bought a new MW22U 22" from X2Gen for $299.00 less a $50.00 rebate.

Specs are

* 1000:1 Contrast Ratio
* 500 cd/m2 brightness
* 1680 x 1050 resolution
* 5 ms response time
* Non-glare glass surface
* Built-in speakers
 

brshelton

I would say go with the 6700 over the 6600. Tom's Hardware did an article on the 8800 GTX showing that it needs the fastest CPU possible to be fully taken advantage of. As for the monitor, if you have decided on it over the processor, go with the Dell 2007WFP; it is the best 20" widescreen out there (I am typing on it right now). That said, I would recommend getting the monitor now and upgrading your CPU to a faster one in the future to eliminate the CPU/graphics-card bottleneck.
 

weskurtz81

brshelton,

That is a valid point, and while it does hold water, the difference between the 6600 and the 6700 will be so marginal that you won't notice it. And you could always just overclock by 300-400 MHz (which a monkey could do) and not worry about it again.

Also, rumor has it that with DX10 the CPU bottleneck will be pushed back onto the GPU, so it might be short-lived.

Like I said, you do have a good point, but the difference would be so small that the user would not be able to see it.
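The "overclock by 300-400 MHz" point is easy to check with arithmetic, given the published specs: the E6600 runs a locked 9x multiplier and the E6700 a 10x multiplier, both on a 266 MHz FSB clock (quad-pumped to the "1066 MHz" rating). A quick sketch:

```python
# How much FSB overclock an E6600 needs to match an E6700
# (the multiplier is locked at 9x, so the FSB clock is what you raise).
multiplier = 9
stock_fsb = 266                          # MHz, quad-pumped to "1066"
stock_core = multiplier * stock_fsb      # E6600 stock core clock
e6700_core = 10 * stock_fsb              # E6700 stock core clock

fsb_to_match = e6700_core / multiplier   # FSB needed to equal an E6700

print(f"E6600 stock: {stock_core} MHz, E6700 stock: {e6700_core} MHz")
print(f"FSB needed to match an E6700: {fsb_to_match:.0f} MHz "
      f"(+{e6700_core - stock_core} MHz core)")
```

About a 30 MHz bump on the FSB (a ~266 MHz core gain) closes the entire gap, so a 300-400 MHz overclock puts an E6600 past a stock E6700.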

wes
 

donnagual

I'll add another vote to the idea of upgrading your monitor. I have a Dell 24" widescreen (the rest of my specs are similar to the OP's) and the difference is amazing.
 

IcY18

Although I currently have a 20.1" widescreen BenQ running at 1680x1050, I would not recommend getting that. I would recommend a monitor that supports 1600x1200. That resolution is supported far more widely in games than widescreen is, which basically nullifies the fact that I have a widescreen monitor when I play BF2 or BF2142, for instance.

Although when widescreen is supported, as in HL2 and Company of Heroes, it does look good.

The downside to a 1600x1200 monitor is that they are usually much more expensive.
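For reference, the raw pixel counts of the two resolutions being compared are simple arithmetic (nothing vendor-specific here):

```python
# Pixel-count and aspect-ratio comparison of the two candidate resolutions.
resolutions = {
    "1680 x 1050 (16:10 widescreen)": (1680, 1050),
    "1600 x 1200 (4:3)": (1600, 1200),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels, aspect {w / h:.2f}")
```

1600x1200 actually pushes ~9% more pixels (1,920,000 vs 1,764,000), so it is slightly harder on the GPU as well as harder on the wallet.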
 

weskurtz81

I have the Dell 2001 20.1" LCD. It's a great LCD... I love it... I would not trade it for the widescreens. But IMO anything is better than the LCD you have now.

wes
 

Granite3

Go here for the widescreen gaming portal.

Click on the games section for the hacks to turn most games into widescreen. Easy and wonderful to behold, as the games look more like movies than the old 4:3 box.

OP: stick with the 6600, OC it ~11% (2.4 to 2.66GHz) and you match the 6700, and you have more than enough price difference left over to get a 22" LCD for ~$300.

Difference in it vs the 17" lcd? Priceless!
 

bydesign

With a single GTX a 600W PSU is the safe choice, but no 600W PSU would last long driving two in SLI. Nvidia recommends a minimum of 450W for a single GTX.

Go with the E6600; all of them should OC to 3GHz with no issues on stock cooling.
 

IcY18

Thanks for the link, but I've already used what they offer, and all that happens is a stretched HUD... nowhere near true widescreen. I've played games in widescreen and they look fantastic, but games like BF2 and 2142 do not support it, and when you force it the way the WS gaming portal says to, it just stretches everything...
 

WizardOZ

This assumes you don't actually have all the parts listed yet; if you do, you may want to look into returns.

Your video card choice is very questionable.

For starters, it is a DX10 card, and the only one out there, so you are paying a hefty premium for that alone because there is no competition for the product. Why are you (and so many others) so willing and eager to pay extra for a product that absolutely requires the ultimate top-end C2D CPU and still doesn't reach full performance?

Since DX10 isn't out and won't be for several more months, and there are no games out that use DX10 (surprise, surprise), there is no way the card will perform to its full designed capabilities for at least six months. I know it is the latest and greatest, but unless you are willing and able to get the top-end C2D CPU, you aren't going to see any real advantage over other cards.

You would be wiser to ditch the 8800 and get something like the ATI X1950X? or the Nvidia equivalent for substantially less money, and invest the savings in a more powerful CPU, a better PSU, or whatever other components you wish to add.

When DX10 arrives and ATI brings its product to market, you will be able to save a fair chunk of cash even if you do go with the 8800. The price on a single card may even drop enough that you can justify getting two of them. SLI, anyone? On top of which, the ATI card might even out-perform the Nvidia chip.

Getting the best is one thing, but wasting cash is another. It is your cash, and you can do what you want with it; I just think you could do a better job of resource allocation and strategic planning. You can always sell the "lesser" card you get now and recover at least 70% of your expenditure. Worst case, you have a pretty good spare card on hand.
 

IcY18

Considering that the 8800 series delivers undeniable graphics supremacy, I think its price is fairly justified. But it is the high end, and regardless of its high-end performance or its performance/$, some people just can't fork over that kind of money.

When ATI finally gets its product to the shelves, yes, prices might drop on the 8800, but once again something more expensive and more powerful will most likely be out...
 

brshelton

I love my widescreen, and with www.widescreengamingforum.com the issue of not being able to game in widescreen is irrelevant. It does not stretch the image; what it does is add an option, or force the program to run at that resolution. I am running 2142 at 1680x1050, and it is not stretched; it is forced to run at that resolution by a line added to the shortcut. I would go with a Dell monitor, either the 20 inch or the 24. With the 24 you are getting 1080p-class resolution, and you also get a composite video input for hooking up things such as an Xbox 360.
 

IcY18

In BF2/2142 it stretches; this is easily noticeable from the minimap in the upper right-hand corner. At a standard resolution the minimap is a perfect circle; when a resolution is forced on BF2/2142, the minimap is an oval, proving that it stretches. I've tried both forcing 1680x1050 and running the only other option supported by both my monitor and the game, which is something like 1400x1050, and 1400x1050 looks much better...
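The oval minimap is exactly what the aspect-ratio arithmetic predicts: if the game renders a 4:3 frame (e.g. 1400x1050) and the panel displays it edge-to-edge at 16:10, everything is scaled horizontally by the ratio of the two aspect ratios. A quick check, using the resolutions mentioned above:

```python
# Horizontal stretch when a 4:3 render is displayed full-screen on a
# 16:10 panel: the stretch factor is the ratio of the aspect ratios.
panel = 1680 / 1050      # 1.60  (16:10 widescreen)
render = 1400 / 1050     # 1.33  (4:3)

stretch = panel / render
print(f"Horizontal stretch factor: {stretch:.2f}")
```

A factor of 1.2 means circles become ovals 20% wider than they are tall, which matches what the minimap shows.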