Well I splurged today, what do you think?

Fionn2003

Distinguished
May 27, 2008
I ended up grabbing the BFG 9800 GX2 for about $475, which is not bad.

I also picked up the new Antec Twelve Hundred case and the ABS Tagan 700 Watt power supply.

I then finished it off with an Intel 2.4GHz quad-core Q6600 processor.

With my already nice Abit board, 4GB of Corsair DDR2, and a Dell 2707 monitor running at 1920x1200, I am hoping this should be a pretty peppy system for gaming.



 

Fionn2003

Distinguished
May 27, 2008
Well, considering Fry's and some other retail stores had it for $529-$599, I thought $475 was the best price to go with, heh.

spoonboy

Distinguished
Oct 31, 2007


I think you should have waited a month.

yadge

Distinguished
Mar 26, 2007


I agree.

But still, he'll definitely enjoy what he got, and it should serve him well for quite a while.

pauldh

Illustrious
Nice setup. Perfect for 19x12 gaming. Enjoy it.

To best match up with that 9800GX2, you could OC the Q6600. 3.0GHz should be possible even with the stock Intel fan, and with proper cooling, way beyond that.
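If it helps, the arithmetic behind a Core 2 overclock is simple: core clock = FSB frequency x multiplier, and the Q6600's multiplier is locked at 9x, so the FSB is what you raise in the BIOS. A quick sketch (266MHz is the stock FSB, 333MHz the common OC setting):

```python
# Core 2 clock speed = FSB frequency (MHz) x multiplier.
# The Q6600's multiplier is locked at 9x, so overclocking
# means raising the FSB in the BIOS.

MULTIPLIER = 9  # fixed on the Q6600

def core_clock_ghz(fsb_mhz: float) -> float:
    """Resulting core clock in GHz for a given FSB."""
    return fsb_mhz * MULTIPLIER / 1000

def fsb_for_target(target_ghz: float) -> float:
    """FSB (MHz) needed to hit a target core clock."""
    return target_ghz * 1000 / MULTIPLIER

print(core_clock_ghz(266))          # stock: ~2.4GHz
print(core_clock_ghz(333))          # common OC: ~3.0GHz
print(round(fsb_for_target(3.6)))   # 400MHz FSB for 3.6GHz
```

So hitting 3.0GHz just means running the FSB at 333MHz instead of the stock 266MHz, with voltage and temps being the real limits.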
 

Fionn2003

Distinguished
May 27, 2008
Well, I could wait, but then after those cards come out, another, better card is going to come out, etc. etc. lol :p Why not just get this one card now, and when prices come down, maybe by Christmas, SLI two 9800 GX2s? :p

Fionn2003

Distinguished
May 27, 2008
Yeah, I've never OCed a processor before. I saw a lot of people now using the Q6600 or higher in their gaming rigs, and I'm just using a 1.8GHz dual core I bought a year ago, so I figure a good CPU upgrade with a better fan could help as well, heh :) I'm also excited about that new Antec Twelve Hundred case that just came out; it is going to look sick with that ABS power supply.

Quad cores are overrated for gaming rigs. They are number crunchers, and really only video encoding and apps that are written to take advantage of multiple cores benefit. 95% of the time a faster dual core will outperform a quad core for games. What 1.8GHz CPU do you have? Just curious.

You are absolutely right about the cards coming out. There will always be a new card coming out; it will always carry a premium price at launch, and people will pay it, then be angry in a few months when it's $100+ cheaper. But by the time they wait it out, a new card comes out at the same premium. It's a never-ending cycle. I bought an 8800GTS/512 for $300 with a $20 rebate, so $280. I can get the same card now for $220. Oh well, I've had it several months and am happy. I didn't buy it for $399 when it first came out, or whatever it was; I never do that. I'll be happy for a year, then sell it for $100-$150 and get another $300 card. That way I only ever pay about half the price of the card and reclaim the rest from the old one. I did that with my X1900XTX.
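The sell-it-while-it's-still-worth-something cycle described above is easy to put numbers on; a small sketch using the figures from the post (the function name is just for illustration):

```python
def net_upgrade_cost(purchase_price: float, rebate: float = 0,
                     resale_value: float = 0) -> float:
    """Out-of-pocket cost of one card over its ownership cycle."""
    return purchase_price - rebate - resale_value

# 8800GTS/512: $300 with a $20 rebate, later sold for ~$150
cost = net_upgrade_cost(300, rebate=20, resale_value=150)
print(cost)  # 130 -- roughly half the sticker price
```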
 

blade85

Distinguished
Sep 19, 2006
Let's just say you won't be needing to upgrade any time soon :p

Though... it would have been nice to wait a few more weeks to see how the 4000-series cards from ATI stack up.

Fionn2003

Distinguished
May 27, 2008

pauldh

Illustrious

Not to nitpick, but I don't agree. People say that all too often, and it's not entirely true. That 95% number especially is, IMO, way off. I think quads get overhyped by some people and underhyped by others. IMO there is no gaming advantage whatsoever to a 4.0GHz or higher dual vs. a 3.6GHz quad. Priced the same, I'd take the quad every time.

Quads do not need the same clocks to match a dual core, even in single- and dual-threaded games. If we're talking stock and fairly low clocks, then that is somewhat true, depending on whether you look at actual gaming or low-res scaling (say, a 2.4GHz quad vs. a 3.0GHz dual at stock clocks).

But really, look how a 3.6GHz Q6600 beats a 3.85GHz E6850 even in single-threaded FEAR, and how the E8400 at 4.2GHz can't hang with a slower-clocked 3.6GHz quad. (Edit: look at all the games in the second link, not just Crysis.)
http://www.xbitlabs.com/articles/cpu/display/core2quad-q6600_8.html#sect0
http://www.legionhardware.com/document.php?id=737&p=2

Look at Q6600 vs e8400 here at stock clocks.
http://www.legionhardware.com/document.php?id=735&p=5

IMO people often get misled by low-res, no-eye-candy scaling tests vs. real gameplay. Firingsquad has a great example in these games: the high-clocked E8500 kicks butt at 800x600 with no FSAA. But who plays at those settings, and what happens at more typical gaming settings? At typical, more GPU-demanding settings, the 2.4GHz Q6600 is often a tad above a 4.17GHz E8500 in the games they test.
http://www.firingsquad.com/hardware/intel_core_2_duo_e8500_wolfdale/page5.asp

Legion shows, even at puny medium details, how little the E8400 really scales at those high clocks:
http://www.legionhardware.com/document.php?id=717&p=7

Honestly, I've said it over and over, but I don't see a big advantage to a quad for pure gaming; far from it, and IMO either is great and pretty equal. But I don't agree with the sentiment that duals are better. Yet it comes up repeatedly, as if there is some flaw to gaming on a quad and it will be slower unless the game is quad-threaded. That's just not true. Max clock them both and the quad will easily stay with a higher-clocked dual; the first two links above show quads can even pull ahead of duals with higher clocks. And even at stock CPU clocks, most games are GPU-limited at gaming settings. Only with beastly graphics power like a GX2 or SLI do I really push people to OC the Q6600 for gaming, as in certain games it does help.
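A toy way to picture the GPU-limited point above: the frame rate you actually see is capped by whichever of the CPU or GPU is slower at a given resolution, which is why low-res scaling tests flatter high-clocked duals while typical gaming settings equalize things. The numbers below are illustrative, not benchmark data:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is limited by the slower of the two components."""
    return min(cpu_fps, gpu_fps)

# At 800x600 the GPU is barely working, so CPU clocks dominate:
print(effective_fps(cpu_fps=90, gpu_fps=300))   # 90 -- CPU-bound

# At 1920x1200 with AA, the GPU becomes the ceiling, so a
# faster-clocked dual core buys nothing over a quad:
print(effective_fps(cpu_fps=90, gpu_fps=55))    # 55 -- GPU-bound
print(effective_fps(cpu_fps=120, gpu_fps=55))   # still 55
```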
 

Fionn2003

Distinguished
May 27, 2008
With the setup I am running, what do you think I could OC my processor to without it getting too hot or damaging the processor?

pauldh

Illustrious

My Q6600 runs at 3.1GHz and doesn't go over 60 degrees under load with the stock Intel cooler. I usually just run 3.0GHz for temp reasons. So depending on your room temps and airflow, 3.0GHz on the stock cooler, and quite a bit higher on the Freezer.

I too have that Freezer and will eventually put it on, since I've decided not to strip the system out and mount a bolt-through (the mobo) cooler. I've used the Freezer before on another rig and it's very nice. Some people hate the push-down mount, but at least it's easier than pulling the mobo. With the warm weather coming and room temps increasing, I'm thinking I'll eventually have temp issues on the stock cooler. I'm eager to see what this Q6600 can do, as 3.1GHz is rock solid at stock voltage and I have not tried higher yet because of temps.

yipsl

Distinguished
Jul 8, 2006
While I think it's nice, I think it's a few weeks too early. Of course, if you think the GTX280 will be unavailable for the whole summer and fall, then it's a reasonable choice for an Nvidia fan to pick up a 9800GX2 today.

IMHO, when new parts are within a few weeks of release, it doesn't make much sense to buy the old parts (except at the low end as a stopgap for a new system, e.g. a 3650 while waiting for the 4870, or an 8800GS while waiting for the GTX260).

Read this with a shaker of salt handy:

http://www.theinquirer.net/gb/inquirer/news/2008/05/24/gtx260-280-revealed

If I sound cynical, it's only because I got a 3870X2 last February; if I'd known a next-gen single GPU as fast as this dual-GPU card would be out in June, I'd have waited. Five months isn't a long time for me, and I could have survived with a 7600GS just a little longer before switching back to ATI.

How often do you guys upgrade GPUs? I generally do so every 2-3 years, about six months before CPU upgrades. Some enthusiasts I've talked to want to do it every 6-12 months.



New parts don't come out that quickly. I don't think a GTX320 will be out soon after the GTX280, or a 5780 soon after the 4870. I do think that SLI'ing two of those might make sense in the future to get good use out of what you've got, but it could be that a simple SLI of two GTX280s will beat two 9800GX2s by a mile while being less power hungry.

I'd thought of CrossfireX with a similarly clocked GDDR3 4850 and my 3870X2, with a new motherboard of course, and I might still do it, but a single 4870X2 will be much nicer.

Fionn2003

Distinguished
May 27, 2008
You mean early or late, heh? Early for the price, or late for the GPU? To me it's not a huge problem. I've seen the 9800GX2 placed against multiple cards in multiple comparisons, and I've seen it do everything I need to keep up with my high-resolution monitor. I don't want to SLI; I would rather have a single card, and the GX2 does that great, not to mention the new drivers that just came out increased its performance.

The new chips are not even out yet; they will be at a premium for months and will either be near the same performance or just barely above it, so it's not really that big of a deal breaker for me.

yipsl

Distinguished
Jul 8, 2006


I don't have any complaints about the capability of the card you got, only that it will be surpassed by the GTX280 (definitely) and equaled by the GTX260 (probably), without SLI scaling issues, in a few weeks.

My only criticism of Nvidia regarding the 9800GX2 is that they should have done dual GPUs on one PCB like ATI's 3870X2. Sometimes they react to ATI and aren't as forward-thinking as they should be.

Regarding launches, ATI actually had the 3850 and 3870 available at a reasonable price within a few weeks of launch, and Nvidia has done well with availability too, though it's hard to get a new card at exactly MSRP right away.

I've never done true Crossfire or SLI. When I got that 7600GS, it was as a stopgap in an Nvidia 405 chipset barebones (I replaced the PSU, though) while I was waiting to see how the 8800 and 2900 cards would do. It turns out neither the 2900XT nor the 8800GTS 320 was that great in DX10, so I just waited, then switched to a budget ATI PCIe x16 board when I got the 3870X2.

With that card, I really liked the idea of internal, hassle-free Crossfire. Either card should be viable for another year or two, especially since AMD and Nvidia are improving their drivers to support internal Crossfire and SLI.

A single 4870 won't be any faster than my card, so I wouldn't think of upgrading right away; it's just that any single GPU that equals a dual-GPU card is a better solution, with no scaling issues. The difference in our situations is that a single GTX280 will be faster than your card.

When to upgrade is subjective, but if you had to make that choice, the card's a good one. IMHO, dual-GPU cards are one step towards dual-core GPUs.

pauldh

Illustrious


I don't recommend buying a GX2 right now by any means, mostly for the price and also for the upcoming cards. But it's a beast of a solution for anyone who owns one. To me, your expectations for the GTX280 and 260 are very high. I hope you are right. I'd honestly be quite amazed if the GTX280 can do a clean sweep against the GX2, never mind the GTX260. Maybe I'm wrong; I just don't think NV has enough this round to pull off the kind of performance leap the G80 made over the GF7950 GX2.

marvelous211

Distinguished
Aug 15, 2006
$475 for an X2 card is never a good idea.

I would have bought something to tide me over until the GT200 and 4870 release. Those would easily outperform these X2 cards without the negative effects of running two GPUs slapped together.

Fionn2003

Distinguished
May 27, 2008
Everyone always says "easily," and from what I heard about these new Nvidia cards coming out, they are still bottlenecked and can only stay around 1 gigahertz anyway. So the expectation that the new cards will totally blow away today's high-end cards just seems silly. The 8800s are only marginally behind the 9800GX2, but in the most demanding games I can run higher AA on my high-resolution monitor without sacrificing too much performance, while some of the 8800s cannot run them at all.

From what I have heard, it's not going to be that huge of a difference, and even if it is a little faster, it doesn't matter; I honestly need something now that will be playable for a few years, heh. I can't wait months. I cannot play a lot of my games right now, and that is frustrating me lol.