Let's make it Official: 8800GTX/GTS yay or nay thread.

February 6, 2007 7:33:45 AM

Since the issue seems to be clouded for most posters, I figured I'd make an official thread documenting the pros and cons of buying an 8800 card today versus later.

This will be 100% unbiased and only used as a means of gathering information to sway the argument one way or another.

The bottom line is that people need to stop asking whether this card is worth it; if this thread hits 10+ pages, people will start looking at it the second they get here. And at least this way we can build some sort of official knowledge base for the board, so people stop misquoting others, stop misinforming build-advice seekers, and stop making bogus claims.

That being said, this initial post will be edited to contain information regarding the 8800GTX: its life expectancy, its performance, its faults/issues and, more importantly, its synthetic DirectX 10 benchmarks as they surface.

Pro Arguments:
-Virtually destroys DirectX9 cards in every DirectX9 game, at sometimes double the frame-rates of high-end DirectX9 cards.
-Some future-proofing. Direct X10 support/shaders.
-Best cards for the money as it stands today.
-The GTS can over-clock really well.

Con Arguments:
-Cost. Double the cost of a high-end DirectX9 card for the GTX.
-Performance per dollar. Some DX9 high-end cards compare at high resolutions to the low end 8800 GTS.
-No proof of future-proofing. (Redundant, yes) No real Direct X10 benchmarks, no final Crysis version.
-GTX runs very hot, with not a lot of head-room for over-clocking. The most people see out of the core is 20-30 MHz, which isn't much compared to other overclock-friendly video cards. Basically, "what you see is what you get" as far as 8800 benchmarks go.
-First version of the core; a revision is in order and could likely beat the 8800's performance by a large margin at a much lower cost, and soon.

Contributions:
Quote:
Also, while you mention that no one in their right mind would buy an 8xxx series card to play at 10x7 or 12x10 you stress the fact that at low res they're bottlenecked by lower-end CPU's. I don't see the point of that argument in practical terms. It's much more interesting to know what happens with different CPU's at 16x12 or 19x12. I think these are the resolutions most gamers in the market for such high-end components would aim for.


To answer that briefly, in the simplest way possible: an 8800GTX/GTS on anything short of a Core 2 Extreme QX6700 will make your CPU officially the limiting factor of your gaming rig.

For example, in FEAR at max settings with 4x AA and 8x AF, the 8800GTX posted neck-and-neck frame rates at 1600x1200, 2048x1536 and 2560x1600 on Intel and on AMD. The Intel Core 2 Duo gave the 8800GTX a higher ceiling of head-room at lower resolutions, but when you get to resolutions above 16x12, there really isn't a benefit at this time to pairing an 8800GTX with an Intel Core 2 Duo versus an AMD. At those high resolutions, the 8800GTX does all the work on its own.

This essentially proves what others in the thread have said.

"The 8800GTX is advanced for its time and other hardware/software companies have to play catch-up". This argument is unfortunately neither yay or nay, but gives you a certain structure of caution when thinking of purchasing an 8800GTX. While for some games, the thorough-bred heated beater is literally night and day for high resolution max setting gaming (Oblivion: Graphics card killer), on other games for high resolution gaming you hit the limit whether you're using a 700$ processor or a $350 AMD processor.

Quote:
About that quote, I'm not sure if you meant the GTX, because I and many other people have gotten a good 125+ MHz OC out of the GTS core.

Core from 500 to 645, no problem; it crashed some benchmarks at 650.
Memory: 1600 MHz to 2 GHz, no problem; 2020 will crash some benches.


The GTS has head-room for overclocking, which essentially makes any version of the GTS more viable. The GTX gets the headlines, but the GTS finds more and more reasons to be the solution to go with as its price steadily drops toward the $300 range. If the GTS gets anywhere near $350 before rebates, I consider it a good card, very good in fact. The GTX, though, always sitting above $550, doesn't impress me.

Short List of Facts:
Tomshardware Article shows 8800GTX/GTS dominating DX9 cards in DX9 games.
Tomshardware Article shows 8800GTX/GTS hitting a bottleneck with mid-range C2D processors and any AMD set up.
Guru3d surfaced information regarding 8600 Ultra/GT, low end DX10 solutions from Nvidia.
VRZone surfaced synthetic benchmarks of the latest version of Crysis, showing lackluster performance on a $4000+ rig with dual 8800GTXs in SLI. **Fake fact:** official information from Crysis-Online refutes it; the proof is that the Crysis engine doesn't use PhysX cards.

The Facts:
Since 1600x1200 seems to show the largest gap for comparison, that resolution will be used for most comparisons. In most circumstances, performance at 1024x768 or 1280x1024 is so high on all high-end cards that it's hard to measure a meaningful difference, and it isn't economically sound to pair a high-end graphics card with a 1024x768 display anyway.

Nvidia launched the G80 in early November 2006. Benchmarks showed great performance compared to high-end DirectX 9 cards.
Tomshardware's Initial Review - 8800GTX and GTS

8800GTX - 12,780 in 3DMark05
8800GTS - 11,420 in 3DMark05
X1950XTX - 9,755 in 3DMark05
7900GTX - 8,282 in 3DMark05

Tomshardware published an article showing that the 8800GTX/GTS needs the fastest processors to reach its performance ceiling. Anything short of an overclocked Core 2 Duo E6600 will inevitably "bottleneck" an 8800.

Geforce 8800 Needs the Fastest CPU

In Doom 3, at lower resolutions, you actually see the performance ceiling at 230 FPS. Basically, a Core 2 Extreme X6800 couldn't send data fast enough to the 8800GTX to actually give it its maximum frames per second. (Not that 230 FPS is visibly lower performance than 300 FPS to the human eye anyway.)

Notice also that with AMD setups, even high-end AMD setups, you can easily hit a bottleneck at low resolutions even with X1950XTXs and 7900GTXs. Essentially that benchmark shows that an AMD system's data cap for Doom 3 is over 100 FPS below the cap for a Core 2 Extreme X6800.

This essentially means that for 1280x1024 or 1024x768 gaming, you wouldn't get better FPS with an 8800GTX over a 7900GTX or X1950XTX if you have an AMD setup. Sorry to all you X2 users.

What that article signifies is that unless you're running a high-end CPU, you will hit a bottleneck with an 8800GTX/GTS, and your system would perform comparably with an X1950XTX or 7900GTX.

Information regarding the 8600GT/Ultra has surfaced. This shows that Nvidia does have a mid-range DX10 competitor shortly on the horizon. The specs are also rather appealing.

Guru3d information on 8600

This, combined with the information regarding a 320MB 8800GTS, shows that there might be a significant sweet spot in the $179-299 price range for DX10-minded buyers.

This link is supposedly the latest Crysis synthetic benchmark results. It doesn't look pretty.

VRZone's Crysis Benchmark.

EDIT: It has been posted and proven that the Chinese benchmarks of Crysis with two 8800GTXs were fake.

Quote:
A Chinese site recently posted a supposed performance benchmark of Crysis running on a beefy system with two 8800GTX graphics cards. This report is completely fake for more reasons than one. The main reason is that they used a PhysX card and indicated that it was enabled in-game, which is an instant giveaway that this is fake, as Crysis doesn't use Ageia physics.

Don't be fooled by all the different postings around the net, they all track back to the false Chinese article. I just wanted to clear this up after seeing 8 separate articles on many popular sites.

UPDATE:

Crytek forwarded an official statement made by EA and Crytek. Here's what it says...

“We understand there are some rumours in some of the message boards about technology benchmarks as it relates to Crysis and we want to assure you that these are just rumours. We realize that the technical specifications and requirements are a huge topic among our community and when we are prepared to give out our system spec recommendations, you will be the first to find out. Please treat everything you do not hear directly from us as speculation.”


So really, we don't have a beta Crysis benchmark on the 8800GTX that reflects its performance. Nor do we have a real benchmark of the Crysis engine with the DirectX 9 API at all.

EDIT: Corrected to translate correct information on higher resolution gaming.

NOTE: "Speculation" on Direct X10 functions and how games will run AFTER Vista optimizations will not be added as it's not really proof but just speculation. The same goes for R600.
February 6, 2007 8:27:27 AM

Good post, or should i say article?;D

I am going for a completely new system in March (my current one is a 1999 AMD 500 MHz...) and I would like to have your opinion: since my budget is USD 2K and all I want is gaming, should I stay with a top DX9 card instead of aiming for a G80 series right now?

Thanks,
david, Portugal
February 6, 2007 8:42:24 AM

Quote:

This essentially means that for 1280x1024 or 1024x768 gaming, you wouldn't get better FPS with an 8800GTX over a 7900GTX or X1950XTX if you have an AMD setup. Sorry to all you X2 users.


Firstly, nice post. Thanks for the effort you put in.

Regarding the quote: apparently, the ATI R600 will not be CPU-limited as severely as the new Nvidia cards, which need the fastest CPU. I would post the link if I could remember it. Anyway, if this is the case, it looks like I will be buying an ATI R600 card then...
February 6, 2007 9:02:58 AM

Quote:
Good post, or should i say article?;D

I am going for a completely new system in March (my current one is a 1999 AMD 500 MHz...) and I would like to have your opinion: since my budget is USD 2K and all I want is gaming, should I stay with a top DX9 card instead of aiming for a G80 series right now?

Thanks,
david, Portugal


Hello David!

How did you calculate the USD 2K from euros? Is it €2K or less?

Anyway, check the local sellers like AquaPC.com or GlobalData.pt to see the prices. If you have enough money, then you should go for the 8800, which already supports DX10.
DX9 cards are still good and more affordable; it all depends on your money and what use you will give the PC. The 8800 is a gamer card. If you don't play 'state-of-the-art' games, then a DX9 card should be enough.

I think in a few months there will be cheaper DX10 cards, with less performance.

Hope this helps!

Portuguese greetings!
February 6, 2007 9:31:23 AM

If I were living and working in the U.S., I would have waited about three months to see what was going to happen. Instead, I decided to buy an 8800 card now rather than upgrade later. The really fast, really new parts are really expensive in this part of the world.

The Middle East edition of PC Magazine looked at 8800 cards this month. The Foxconn 8800GTS was the least expensive at $435. The XFX8800GTS was $650. The GTX boards tested ranged from $780 to $900.

I have not yet figured out why 3DMark06 will not run on my system, so I used '05 for testing. Testing from 2.4 GHz to 3.6 GHz produced surprising results. With EVERYTHING stock - components assembled, no tweaking - 3DMark05 scored 12,948. Disabling SPREAD SPECTRUM in the BIOS yielded the largest increase - 13,880. Increasing CPU core speed to 3.6 GHz gave no significant improvement - 13,963. However, I did see a fairly linear increase in the CPU score - 8,224 to 12,246.

I did not try to overclock the GPU core. There isn't much room for improvement there.

This seems to indicate that a C2D system overclocked to just 3.0 GHz will keep an 8800GTS fed. I suspect that a GTX will need a C2D overclocked to "enthusiast" levels.

john
Anonymous
February 6, 2007 1:11:18 PM

Nice post indeed,

I got my GTS before Christmas and I am extremely satisfied.

Quote:
Runs very hot, with not a lot of head-room for over-clocking. The most people see out of the core is 20-30 MHz, which isn't much compared to other overclock-friendly video cards. Basically, "what you see is what you get" as far as 8800 benchmarks go.


About that quote, I'm not sure if you meant the GTX, because I and many other people have gotten a good 125+ MHz OC out of the GTS core.

Core from 500 to 645, no problem; it crashed some benchmarks at 650.
Memory: 1600 MHz to 2 GHz, no problem; 2020 will crash some benches.

Running at 75°C under full load while staying silent at 645 is really good (my case probably helps a good 5°C).

Other than that, I agree with most of the points. I would probably hold off on a video card purchase for the next month to see what March has to offer, which will be a lot!
February 6, 2007 1:14:46 PM

I don't think I'll ever shell out that kind of money on a video card. If I had some comfortable money to spend on a new computer, I'd get an X1950XT, and upgrade later.
February 6, 2007 1:28:10 PM

Quote:
I don't think I'll ever shell out that kind of money on a video card. If I had some comfortable money to spend on a new computer, I'd get an X1950XT, and upgrade later.


I might do just that....

I'm specing a comp together right now, and I might just grab an xt (or a gx2) and wait to see what's coming up...

It's so hard for me to wait to buy something when I really really want to get it ;x
February 6, 2007 1:33:57 PM

By getting the mainstream card now and upgrading later, you can be sure that you get good gameplay now and good gameplay tomorrow. The 8800GTS is powerful now but it will pale in comparison to future cards. There's no point in trying to be future-proof. Get what's available now for a reasonable price, and when it's outdated, upgrade.

Oh, and forget about that cludgy 7950GX2. X1950XT FTW. :wink:
February 6, 2007 1:36:21 PM

Well, I yay'ed on it. I built a new system in December. At the end of January I found out my 8800GTX had a bad voltage regulator that kept it from updating drivers, because the Nvidia sentinel kept saying the card did not have enough power. I was running a PC Power and Cooling 750W SLI-certified supply and only "one" card. I bought a new 1kW BFG Quad SLI-certified PSU and the same thing happened.

I have ordered a new GTX from Newegg, a KO version, to replace the bad one still in my computer until my RMA gets "approved" by EVGA. It was past the 30 days, so Newegg would not take it back. Once EVGA decides to send me a replacement, whenever that happens to be, I will eventually end up with two 8800GTX cards, which I will set up in SLI and overclock the sh*t out of just to see what breaks, because EVGA po'ed me.

So for you guys that think $625 is a lot, try well over $1100 and counting.
February 6, 2007 1:55:28 PM

Good points. One thing that is not noted here is the lack of drivers. I'm not complaining, but there's definitely a large share of people out there doing that. I've written up an idea on why they aren't done for Vista, but I never get an answer on it. If anyone is interested, though, I'll post it; mind you, it's a bit long.
February 6, 2007 2:45:25 PM

The majority of rigs/CPUs can't push it to its potential, and DX10 support is useless in today's gaming environment. These GPUs have progressed to the point where the remainder of a computer's components and software are playing 'catch-up'.

As far as future proofing, why buy the first DX10 supported card available? By the time you can actually take FULL advantage of this card, it will be available for far less money, and cards in its current price range will be far superior. At this point, I would hope that driver support would no longer be an issue either.

Sooooo... I have to say nay for anyone working within a budget, as money should be spent on other, more important components. If you have ~$600 in your budget allocated to a GPU(s), you're better off running a single GTX than two inferior cards in SLI... Either way, I feel that ~95% of gamers can have their needs met by simply running (and OCing, if necessary) vid cards which can be had for less than half the cost of the 8800.

Just my 2 cents... :) 
February 6, 2007 2:57:20 PM

I have to agree with bruce555: The drivers issue, since we're talking about the cards TODAY, is rather important IMO. I just read an article on gaming on Vista ( http://www.extremetech.com/article2/0,1697,2090571,00.a... ) and it seems that even with the new OS a GTS will work fine most of the time. However, the drivers will definitely mature during the coming months so at the moment I would add that issue to the "Cons" list.

Also, while you mention that no one in their right mind would buy an 8xxx series card to play at 10x7 or 12x10 you stress the fact that at low res they're bottlenecked by lower-end CPU's. I don't see the point of that argument in practical terms. It's much more interesting to know what happens with different CPU's at 16x12 or 19x12. I think these are the resolutions most gamers in the market for such high-end components would aim for.
February 6, 2007 3:20:53 PM

Quote:
I have to agree with bruce555: The drivers issue, since we're talking about the cards TODAY, is rather important IMO. I just read an article on gaming on Vista ( http://www.extremetech.com/article2/0,1697,2090571,00.a... ) and it seems that even with the new OS a GTS will work fine most of the time. However, the drivers will definitely mature during the coming months so at the moment I would add that issue to the "Cons" list.

Also, while you mention that no one in their right mind would buy an 8xxx series card to play at 10x7 or 12x10 you stress the fact that at low res they're bottlenecked by lower-end CPU's. I don't see the point of that argument in practical terms. It's much more interesting to know what happens with different CPU's at 16x12 or 19x12. I think these are the resolutions most gamers in the market for such high-end components would aim for.


I added your second paragraph as a contribution to the original post with your name included, as I feel that is an important issue regarding CPU bottle-necking.

Your first paragraph is more of a cross-over subject of Vista gaming and DX10 cards, and really it's mostly Microsoft's fault for any problems G80 owners are having with Vista. I mean, there's a lack of drivers for the 8800GTX/GTS, but really, what can Nvidia do? They don't have any synthetic DirectX 10 programs to use as a base for their driver optimization and Vista's lack of solid driver support at launch is an even bigger thorn in G80 owners' sides.

While I dislike the 8800GTX, I can't lie and say the Vista problems with the 8800GTX are faults of engineering/driver support. They're not. It's all on Bill Gates.
February 6, 2007 3:23:20 PM

I just got a X1950 XT. This should keep me happy until the next-gen DX10 cards come out. Even the 8800 GTX might not be fast enough at 2560x1600. The next gen-cards should be twice as fast as the 8800 GTX, consume less power, and be more mature with more mature drivers. By then, I'll buy Vista 64 and the first REAL DX10/64bit/multi-core games should come out.
February 6, 2007 3:30:13 PM

Quote:
I just got a X1950 XT. This should keep me happy until the next-gen DX10 cards come out. Even the 8800 GTX might not be fast enough at 2560x1600. The next gen-cards should be twice as fast as the 8800 GTX, consume less power, and be more mature with more mature drivers. By then, I'll buy Vista 64 and the first REAL DX10/64bit/multi-core games should come out.


Your argument is only half-way true at the moment though.

The sad reality is, today there aren't processors fast enough to push the 8800GTX to its limit. The Core 2 Extreme X6800, which benchmarks better for gaming, hit a frame-rate ceiling at resolutions higher than 1600x1200, even with 4xAA and 8xAF.

If the new 6x50 series processors aren't able to up the ante, you're likely to see no reason to go higher than an 8800GTX even if an R600 beats it in benchmarks for lower resolution graphically intensive games.

Essentially, what people are saying is that it's a double-edged sword. If you can't get higher than 35 FPS at 2560x1600, whether you use an 8800GTX or an 8900GT or an ATI X2000XT, then why pick any one of them in particular? You can't tell which is the best.

And on the other end of the argument, the lower resolutions: why choose a high-end card at ALL if most cards from high to mid-range can hit 80+ frame rates at lower resolutions?

It doesn't necessarily knock the 8800GTX, it just kind of shows us that we have NO idea what its actual limitations are at high resolutions, because no processor can push it hard enough.

The sad truth is that if the R600 is comparable to the 8800GTX, we won't see the limits of the R600 either, and then, in the current market, the question "Which is better between the 8800GTX and the ATIBlahXT?" will NOT have an answer. The answer will pretty much be, "Whatever's cheaper, buddy."
February 6, 2007 4:04:44 PM

I think it's a yay. I like Nvidia's design architecture this time around. Now they just need to release a driver to take advantage of it. The one thing I think is that Nvidia's shader design, running independently of the core, can do more shader operations than the R600 can (that is, if the R600 is 64 unified shaders with 128 shader operations per cycle).
Anonymous
a b U Graphics card
February 6, 2007 4:05:44 PM

Quote:
Essentially, what people are saying is that it's a double-edged sword. If you can't get higher than 35 FPS at 2560x1600, whether you use an 8800GTX or an 8900GT or an ATI X2000XT, then why pick any one of them in particular? You can't tell which is the best.

And on the other end of the argument, the lower resolutions: why choose a high-end card at ALL if most cards from high to mid-range can hit 80+ frame rates at lower resolutions?

It doesn't necessarily knock the 8800GTX, it just kind of shows us that we have NO idea what its actual limitations are at high resolutions, because no processor can push it hard enough.

The sad truth is that if the R600 is comparable to the 8800GTX, we won't see the limits of the R600 either, and then, in the current market, the question "Which is better between the 8800GTX and the ATIBlahXT?" will NOT have an answer. The answer will pretty much be, "Whatever's cheaper, buddy."


That is where DX10 and Vista might play an important role down the line. It might not be obvious or even noticeable with these cards because the games won't be there yet, but in the future I would not be surprised to see new GPUs "scale" better with DX10, thanks to its much more efficient architecture with much less overhead.

What you just stated is one of the actual reasons I love my GTS: I can play games like R6: Vegas at 1680x1050 with everything maxed out except for shadows on medium (higher doesn't look much better and might give stuttering in heavy scenes). I also have a 1280x1024 monitor that I will be able to use once games push the card too far. I really didn't have any need for a GTX.

I still think it was CAD$510 well invested!

And a side note about the CPUs: I think Penryn with a 3-4 GHz clock will be the one giving a real boost, not the E6x50, since it's been shown that the 1333 MHz FSB doesn't do much compared to extra MHz!
February 6, 2007 4:08:35 PM

I have an X2 processor and an 8800GTS, and despite the folks who say "anything less than an E6600 will leave you gimped," my setup works very well. My current CPU is an FX-60 clocked at 2.8 GHz. It can handle Oblivion with ALMOST all settings maxed out, and viewing distance maxed out, at good FPS. Now if I had a GTX, I have no doubt that I could max everything out and run at good FPS, even with HDR + 16xAA + 16xAF. I might still be a few frames slower than an E6600, but it would still look great.

So I guess what I'm saying is that there is a point where your CPU bottlenecks the 8800s, but that point is somewhere in the 2.4-2.6 GHz range for AMDs and anything below an Intel E6300 at stock. I've read a lot of positive feedback from people who have an FX-60, FX-62, 5000, or 5200 and are getting great performance out of their 8800s.

Knowing what resolution you are going to be running at is a HUGE factor in deciding if the 8800's are going to be worth it as well.

OR, in my case, I had a $100 gift certificate to burn, so the GTS became as affordable to me as an X1950 Pro (AR).

There are a million variables to look at to decide if an 8800 is "worth it" to you... CPU, your monitor's resolution, the strength of your power supply, the strength of your wallet supply, and the strength of your stomach when you see the 8800s drop $100 two days after you finally buy one.

I say, if your CPU is decent or overclocked to around 2.5, and you play at resolutions above 1024X768, you won't be disappointed with an 8800. I used to have a 7900GS... the difference is absolutely NIGHT and DAY.

Cheers.


EDIT: YAY.
February 6, 2007 4:53:25 PM

Quote:
I think it's a yay. I like Nvidia's design architecture this time around. Now they just need to release a driver to take advantage of it. The one thing I think is that Nvidia's shader design, running independently of the core, can do more shader operations than the R600 can (that is, if the R600 is 64 unified shaders with 128 shader operations per cycle).


actually 256 per cycle.
February 6, 2007 4:57:35 PM

The 8800GTS is not that expensive any longer and destroys the highest-end cards out there. Plus it is DX10 capable. Look at it this way: a 7900GTX costs around $470 US new. You can get a used one for around $350. I just purchased a slightly used XFX 8800GTS for $360 and it crushes the 7900GTX. I got something like 8500+ in 3DMark06 with 1 GB of low-end RAM. (I sold my DDR500 stuff as I am moving to a new rig this week.)

Personally, I set my vid card limit at $350-$370 and run the card(s) for at least 3 years. About a year after the initial new card purchase the prices have dropped, as the next great thing has come along, and I can pick up another card of the same flavor, usually for about $280, do SLI and get a 60%-80% performance boost. After three years, I sell the cards for what I can get, buy whatever I can at $350-$370 and start the whole process over. It has worked very well for me, and no, you will never be top dog on the benchies, but you can play anything out there with eye-watering quality.
February 6, 2007 5:00:14 PM

I'll go with "nay" on the 8800 series, for now...

I snagged an EVGA 7900GT a while back and have it OC'd to 550/1500 at the moment, and it plays everything at 1280x1024 with medium eye candy on my 19" LCD. It would be hard to justify the cost of an 8800 over what I have now, especially since it will take a while for DX10 games to fully hit the market. Plus, I learned years ago never to jump on the "bleeding edge GPU" bandwagon, so I will wait until they release the next version of the G80 or see how the R600 turns out.
February 6, 2007 5:49:59 PM

Quote:
I just got a X1950 XT. This should keep me happy until the next-gen DX10 cards come out. Even the 8800 GTX might not be fast enough at 2560x1600. The next gen-cards should be twice as fast as the 8800 GTX, consume less power, and be more mature with more mature drivers. By then, I'll buy Vista 64 and the first REAL DX10/64bit/multi-core games should come out.


Your argument is only half-way true at the moment though.

The sad reality is, today there aren't processors fast enough to push the 8800GTX to its limit. The Core 2 Extreme X6800, which benchmarks better for gaming, hit a frame-rate ceiling at resolutions higher than 1600x1200, even with 4xAA and 8xAF.

If the new 6x50 series processors aren't able to up the ante, you're likely to see no reason to go higher than an 8800GTX even if an R600 beats it in benchmarks for lower resolution graphically intensive games.

Essentially, what people are saying is that it's a double-edged sword. If you can't get higher than 35 FPS at 2560x1600, whether you use an 8800GTX or an 8900GT or an ATI X2000XT, then why pick any one of them in particular? You can't tell which is the best.

And on the other end of the argument, the lower resolutions: why choose a high-end card at ALL if most cards from high to mid-range can hit 80+ frame rates at lower resolutions?

It doesn't necessarily knock the 8800GTX, it just kind of shows us that we have NO idea what its actual limitations are at high resolutions, because no processor can push it hard enough.

The sad truth is that if the R600 is comparable to the 8800GTX, we won't see the limits of the R600 either, and then, in the current market, the question "Which is better between the 8800GTX and the ATIBlahXT?" will NOT have an answer. The answer will pretty much be, "Whatever's cheaper, buddy."

You are correct. I need to mention I will get a quad-core Yorkfield at the same time I get all this other stuff. Then I expect all will go smoothly. Until then, my E6400 @ 2.9 GHz is good enough.
February 6, 2007 6:42:51 PM



I know they say 4-way, but that refers to the SIMD width, not to the number of shader operations.

This is from DailyTech and VR-Zone, both places I believe have a card, definitely VR-Zone given their latest picture releases.

"64 4-Way SIMD Unified Shaders, 128 Shader Operations/Cycle"

It's only a couple of places that I've heard 256 from, and that was a long time ago. I've gone on to say why I think the 128 shader operations per cycle figure is correct, but only if you want to go into it.

"I've edited this"
February 6, 2007 8:44:12 PM

Quote:


The sad reality is, today there aren't processors fast enough to push the 8800GTX to its limit. The Core 2 Extreme X6800, which benchmarks better for gaming, hit a frame-rate ceiling at resolutions higher than 1600x1200, even with 4xAA and 8xAF.


Well, most people that get an 8800GTX, like myself, would never consider running their C2D at such pedestrian default speeds as 2.9 GHz. I am running 3.5 GHz with a simple E6400, and that's low for many who are into OC'ing their E6300, E6400, E6600 and, yes, the X6800. So the CPU, in a great many of these instances, is NOT such a limiting factor after all.
February 7, 2007 10:30:28 AM

Quote:
To answer that shortly, in the simplest way possible- An 8800GTX/GTS on anything short of a Core2Duo QX6700 will make your CPU officially the limiting factor of your gaming rig.

For example, on FEAR @ max settings, 4x AA 8x AF, the 8800GTX had neck-and-neck frame rates at 1600x1200, 2048x1536 and 2560x1600 with Intel as it did with AMD. The Intel Core 2 Duo pushed a higher ceiling of head-room for the 8800GTX at lower resolutions (meaning the Intel processor was the factor in the FPS lead over the AMD when combined with the same GPU). It also means that a person even with a Core 2 Extreme X6800 would sometimes not get better frame rates at really high resolutions when compared to an AMD processor.

This essentially proves what others in the thread have said.


I'm not sure if I understand you correctly, but are you saying that a person who hits a ceiling at high resolutions regardless of processor is CPU bound? Because it's actually the opposite; at high resolutions it doesn't make any difference what CPU you're using, because the GPU is the LIMITING factor. When a reviewer wants to test a particular CPU's abilities in games, they focus on lower resolutions, since these can be handled easily by a powerful GPU and therefore the strengths and weaknesses of the CPU in question are revealed. At high resolutions, especially with eye candy on, the GPU starts to run into trouble and the CPU becomes less of a factor.

Let me point you to a FiringSquad article about that very subject: http://www.firingsquad.com/hardware/geforce_8800_gtx_gt... If you read the conclusion you'll see that the GTX is usually CPU bound, the GTS only sometimes, but it heavily depends on the games too. Older titles are obviously no sweat for the GPU and the CPU is the limiting factor. But newer, shader-heavy titles scale well with different CPUs, because even the mighty 8800 cards start to show their limitations. And it is rather probable that a person who buys an 8800 card today will do so in order to play newer titles as well as upcoming ones like Crysis and UT2007, at high resolutions and with eye candy on. Of course you might still benefit from a C2D X6800, but if you want to play Oblivion with HDR + 4xAA + 16xAF @ 1920x1200, you can do so better with an X2 3800+ and an 8800GTX than with an FX-62 and an X1950XTX ( http://www.firingsquad.com/hardware/geforce_8800_gtx_gt... ).

I'm not actually recommending that anyone in the market for an 8800 card stick with an X2 3800+; I'm just saying that if we're talking about high resolutions, then the CPU matters much less than the GPU, and even the X2 3800+ owner who wants to play F.E.A.R., Oblivion, CoD2, Crysis etc. at high res + eye candy will see some benefit with a GTX/GTS over a previous-gen card.
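To put the CPU-bound versus GPU-bound point in concrete terms, here is a minimal sketch of the reasoning (not taken from the FiringSquad article): treat the delivered frame rate as the lower of a roughly resolution-independent CPU cap and a GPU cap that shrinks as the pixel count grows. Every number in it is invented purely for illustration.

# Hypothetical model: frame rate = min(CPU-limited rate, GPU-limited rate).
def effective_fps(cpu_cap_fps, gpu_pixel_rate, width, height):
    """Rough frame rate: whichever of the CPU cap or GPU cap is lower wins."""
    gpu_cap_fps = gpu_pixel_rate / (width * height)  # GPU cap falls as resolution rises
    return min(cpu_cap_fps, gpu_cap_fps)

# Made-up figures: a fast CPU preparing 180 frames/s, a slower one managing 120,
# and a GPU that can shade roughly 220 million pixels/s of this game's workload.
for cpu_cap in (180.0, 120.0):
    for width, height in ((1024, 768), (1600, 1200), (2560, 1600)):
        fps = effective_fps(cpu_cap, 220e6, width, height)
        print(f"CPU cap {cpu_cap:.0f} fps, {width}x{height}: ~{fps:.0f} fps")

# At 1024x768 the two CPUs give different numbers (CPU-bound); at 2560x1600 both
# land on the same GPU-limited figure, which is exactly Primitivus's point.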
February 7, 2007 10:52:11 AM

I popped over to the vr-zone site, and I found ... no real news. Well, OK. I learned that R600 is coming in March - or April. R6XX will have three or four models. It will use DDR2, DDR3, or DDR4 memory. R600 will be fast.

Right now, it is a semi-mythical product. I suspect that ATi was aiming for a Christmas '06 release, either ran into technical or performance problems, and decided to go back to the drawing boards. And now they are trying to build marketing buzz - the classic IBM FUD factor. The announcements trickling out imply, "The R600 will be faster. It will be cheaper. It will lower your cholesterol, and it will not rust."

I hope so. As with Intel and AMD, competition is good for us.

john
February 7, 2007 10:54:28 AM

Quote:
To answer that shortly, in the simplest way possible- An 8800GTX/GTS on anything short of a Core2Duo QX6700 will make your CPU officially the limiting factor of your gaming rig.

For example, on FEAR @ max settings, 4x AA 8x AF, the 8800GTX had neck-and-neck frame rates at 1600x1200, 2048x1536 and 2560x1600 with Intel as it did with AMD. The Intel Core 2 Duo pushed a higher ceiling of head-room for the 8800GTX at lower resolutions (meaning the Intel processor was the factor in the FPS lead over the AMD when combined with the same GPU). It also means that a person even with a Core 2 Extreme X6800 would sometimes not get better frame rates at really high resolutions when compared to an AMD processor.

This essentially proves what others in the thread have said.


I'm not sure if I understand you correctly, but are you saying that a person who hits a ceiling at high resolutions regardless of processor is CPU bound? Because it's actually the opposite; at high resolutions it doesn't make any difference what CPU you're using, because the GPU is the LIMITING factor. When a reviewer wants to test a particular CPU's abilities in games, they focus on lower resolutions, since these can be handled easily by a powerful GPU and therefore the strengths and weaknesses of the CPU in question are revealed. At high resolutions, especially with eye candy on, the GPU starts to run into trouble and the CPU becomes less of a factor.

Let me point you to a FiringSquad article about that very subject: http://www.firingsquad.com/hardware/geforce_8800_gtx_gt... If you read the conclusion you'll see that the GTX is usually CPU bound, the GTS only sometimes, but it heavily depends on the games too. Older titles are obviously no sweat for the GPU and the CPU is the limiting factor. But newer, shader-heavy titles scale well with different CPUs, because even the mighty 8800 cards start to show their limitations. And it is rather probable that a person who buys an 8800 card today will do so in order to play newer titles as well as upcoming ones like Crysis and UT2007, at high resolutions and with eye candy on. Of course you might still benefit from a C2D X6800, but if you want to play Oblivion with HDR + 4xAA + 16xAF @ 1920x1200, you can do so better with an X2 3800+ and an 8800GTX than with an FX-62 and an X1950XTX ( http://www.firingsquad.com/hardware/geforce_8800_gtx_gt... ).

I'm not actually recommending that anyone in the market for an 8800 card stick with an X2 3800+; I'm just saying that if we're talking about high resolutions, then the CPU matters much less than the GPU, and even the X2 3800+ owner who wants to play F.E.A.R., Oblivion, CoD2, Crysis etc. at high res + eye candy will see some benefit with a GTX/GTS over a previous-gen card.

That's exactly what he said, and you are very right, Primitivus.

I guess he looked at the results at 2560x1600 and thought, "hmm, both CPUs aren't getting above 40 FPS... must need a better one."

Very funny. Perhaps someone should proof-read the "guide".
February 7, 2007 11:13:18 AM

I think you would have to be stupid to buy an 8800 now (unless you're badly in need of an upgrade or your current GPU breaks). I would guess that the majority of people reading this who are interested in getting an 8800 currently have a 6600GT or better. With the R600 coming out next month, revisions of the 8800 coming up, and no DX10 games available (let alone the driver problems to run them on Vista), and factoring in that most people will be looking to buy a whole new computer to do this - C2D is 8 months old now, AMD is on the horizon with K10, never mind Penryn as well - I would suggest that people wait until the first DX10 game is released and benchmarks are obtained before considering an upgrade.

Normally I would say, hell, there are always technologies on the horizon, you may as well buy what you can afford now. But the current software/hardware situation makes me think otherwise, and I'm going to hold off my build until July (Crysis is out at the end of June). This gives me the chance to do some case modding before I get my parts, so that it all runs in the case I want.
February 7, 2007 11:26:14 AM

You are right. If we only wait 5 months we can get a Wolfdale (Penryn is for laptops) and an 8900 GTX. Or we could wait another 7 months and get the 9800 and the 32nm Conroe... it never ends.
February 7, 2007 11:30:42 AM

Everyone here is looking at today and not tomorrow when they say that the 8800 series is CPU limited. DX-10 is here; it isn't fully supported, but it is here and it is the future. One of the primary changes that DX-10 brings is that it relieves the CPU of many of the graphics-related duties assigned to it under DX-9. If you look at some comparisons between DX-9 and DX-10, you see that the CPU is working much, much less on graphics duties under 10, so how does that relate to the need for a more powerful CPU with a DX-10 card? I do agree that the CPU can limit these cards when running DX-9, since DX-9 depends so heavily on the CPU, but in a few months' time DX-10 will be mainstream and the CPU will no longer be a limiting factor per se.

I can't follow the logic as far as it being stupid to buy the first DX-10 card. If you need a card now, and if the price is reasonable and comparable to the 7-series cards, why in the world would you buy a 7 series? Of course, if you wait a few months there will be a new DX-10 card out that will be better, but when is that not the case?

No matter when you buy a new vid card you will find yourself with the last great thing in a matter of months since vid cards are continually evolving. If you can wait a few months then absolutely do it as the 8 series price will always be decreasing and there will surely be a great new card on the market soon, but this will be the case six months from now, one year from now and five years from now. As soon as you plunk down the money you are obsolete.

If you really need a card, and the prices have leveled and are reasonable, then get yourself an 8800 series. It is an awesome card and this driver issue will be a forgotten episode in 2-3 months. Don't worry so much about the CPU as DX-10 will adjust that. Don't spend any money on a DX-9 only card, you will kick yourself for that stupid move a few months from now. DX-9 is a technological dead end.
February 7, 2007 12:26:34 PM

Quote:
I think you would have to be stupid to buy an 8800 now (unless you're badly in need of an upgrade or your current GPU breaks). I would guess that the majority of people reading this who are interested in getting an 8800 currently have a 6600GT or better


That's me. Box #2 below was my current computer. If I had had something newer, I would have waited until my next vacation and brought back newer parts.

There are always better (faster or less expensive) parts on the near horizon. But this WOULD be a good time to wait a few months.
Anonymous
February 7, 2007 4:52:24 PM

I agree with you, and I said something similar about DX10 and future cards.

I must say that right now buying an 8800 is not the smartest thing, with the R600 available in a month or so; it is worth the wait. Two months ago, that wasn't the case!
February 7, 2007 4:59:24 PM

That R600 does sound like a nice card, but I am an SLI type, so Nvidia is the way to go for me regardless.
February 7, 2007 5:42:30 PM

Word, Playa.
February 7, 2007 6:57:04 PM

The consensus is, today it's a mistake because we're literally almost at the shoreline of seeing the R600 launch.

If you want SLI regardless, that's fine, but realize that the R600 launch is significant to you too.

8800 prices come down, and Nvidia announces its next G80-based high-end card.

It'll either be an 8900GT or maybe an 8850GT or something along those lines.

Whatever it is, it'll be revised and better than the 8800GTX, I can assure you that.
February 7, 2007 7:10:07 PM

I am sure it is. I bought my 8800GTS used for $360 and it runs like a dream, so I'm not too worried about missing the next best thing, and the R600 sounds superb. That is my personal number for investing in new graphics tech ($350-$370). If you get the right deal at the right time you have some pretty good tech, certainly not the best, and you don't want to slit your throat when your $680 investment becomes obsolete a month down the road.

When the card starts to drag a little I just get another, at a much discounted price, and run SLI. Using this plan I am generally able to run anything out there at very high settings for around 2.5-3 years, at which time I sell the cards, or move one to my wife's rig and sell the other, and start the process over.

I know I'll never be king of the benchies by spending $360 on a vid card but it works for me. My 6800GTs are really dragging now after 2.5 years and moving to this 8800GTS was awesome for the price. In a few months there will be a bunch of GTS cards on the used market for $250 and I'll grab another and sit tight for a couple of years. Total outlay for 2-3 years of high end graphics is around $600 (XFX has double lifetime warranty so they are great with this plan).
February 7, 2007 7:20:07 PM

Quote:
at high resolutions it doesn't make any difference what CPU you're using because the GPU is the LIMITING factor.


Primitivus = TRUE
February 7, 2007 7:50:11 PM

meh, my thoughts would be that if you have any card within the last year or so... wait until dx10 actually hits and we can see what has the better dx10 performance. The 1900 and 7900 cards both do well enough in dx9 games to hold you until then. AMD still has yet to pull out their dx10 part (no rush looking at vista's current state and lack of dx10 games) and we have no idea what the cards from either company will do there.

only caveat: if your card is crusty-old and you have $ to burn then go ahead. Otherwise looking at the hierarchy of current gpus the ones I just mentioned are very well suited to last a while yet.

JMO of course.
February 7, 2007 7:57:09 PM

Looking to the future, I don't know how the R600 will stack up against the G80. I know that ATI has some big advantages over Nvidia, being that this is their second unified design, and they're much further ahead of Nvidia on the driver front.

I know this is all speculation because nothing is released and nothing is official, but the leaked specs at about this point in development seem correct. The G80's leaked specs were, so I'm thinking these are somewhat correct, ignoring any clock speeds.

If we talk about each of their unified shader designs, then let's look at ATI's first. Looking at the leaked specs, ATI has 64 4-way SIMD units that can perform 128 shading operations per cycle. To me it looks like they have taken the standard approach to shading design, with a straight throughput of information. Having the unified shaders inline with the normal pipeline, with the shading operations running at the core speed, is a drawback compared to Nvidia's design. Nvidia's design has the shaders running completely independently of the core, so more work can be done on developing the pixel before sending it to the ROP. These differences probably account for the high memory bit rate and core clock speed of the R600. The high core clock speed also makes up for the R600 only having 16 ROPs.

The R600's high core clock speed makes up for Nvidia's independently clocked shaders, and the high bandwidth is probably for storing pixels in main memory, to account for Nvidia's independent texture units, which allow information to be stored on them and then recycled back for more stream processing. Simply look at the G80's 128 SPs @ 1350 MHz, compared to the R600's 128 shader ops per cycle @ a 700-1000 MHz core.
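The comparison in that last sentence can be worked out numerically. These are the rumoured figures quoted in the thread, not official specs, and the two kinds of "operation" are not necessarily counted the same way, so treat this as a rough sanity check of the argument rather than a real throughput comparison.

# Rumoured figures only; the units may not be directly comparable.
g80_stream_processors = 128
g80_shader_clock_hz = 1.35e9               # G80's independent shader clock

r600_ops_per_cycle = 128                   # rumoured shader ops per core cycle
r600_core_clock_hz = (0.7e9, 1.0e9)        # rumoured core clock range

g80_ops_per_sec = g80_stream_processors * g80_shader_clock_hz
r600_ops_per_sec = [r600_ops_per_cycle * clk for clk in r600_core_clock_hz]

print(f"G80  ~{g80_ops_per_sec / 1e9:.0f} G shader ops/s")
print(f"R600 ~{r600_ops_per_sec[0] / 1e9:.0f}-{r600_ops_per_sec[1] / 1e9:.0f} G shader ops/s")
# Roughly 173 versus 90-128: the gap the poster says the R600's higher core clock
# and memory bandwidth have to make up for.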

The reason I think I'm right about ATI using a traditional style of "pipeline" design is that they are done with all of their drivers across the board, even Linux. To me this looks like (remember the R600 has been in development for some years now) after they found out that the G80's design has shaders clocked independently of the core, they had to go through respin after respin to bring up the clock speeds. The initial core design stayed the same, allowing the driver designers to start on their drivers long ago, with the only real obstacles in the driver design being writing for the unified shaders and the general reprocessing of the geometry shaders after the vertex calculations are done, making the general design of the driver relatively simple to adapt.

I think the reason that Nvidia is taking so long is that they want to fully use the stream processors' independence, along with heavy use of the texture filtering units, to allow for a much higher throughput of shader operations. Making this happen is a driver-design nightmare because of the completely independent design of everything on the G80 (independent shader clock and independent texture units), requiring a completely new approach to driver design. Looking at these points, I wonder if Nvidia will ever be able to write a driver that uses the G80's full potential.

By the way, I'm not an Nvidia fanboy; this is the first Nvidia product I've owned.
February 7, 2007 8:00:09 PM

davidrio: I think you should get a Core 2 Duo E6400, get some good RAM, get some other nice things like an X-Fi sound card, overclock anything you can, and then buy the new ATI DX10 card that is coming out really soon. You can't go wrong :)
February 7, 2007 8:17:53 PM

Quote:
The consensus is, today it's a mistake because we're literally almost at the shoreline of seeing the R600 launch.

If you want SLI regardless, that's fine, but realize that the R600 launch is significant to you too.

8800 prices come down, and Nvidia announces its next G80-based high-end card.

It'll either be an 8900GT or maybe an 8850GT or something along those lines.

Whatever it is, it'll be revised and better than the 8800GTX, I can assure you that.


YUP!