
About an 8800 GTS 640MB

March 30, 2007 3:42:14 PM

My layout :D 

DFI LP SLI-Expert
150GB Raptor X
2x 1GB XMS Expert DDR400
Opteron 170 OC'd to 2.5GHz
EVGA 7900GT OC'd to 585/1790
Antec Neo HE 550W
_____________________________________
My question is: will my Opteron become a bottleneck if I were to purchase an 8800 GTS 640MB version GFX card?
I've been reading lots of articles saying the 8800 series needs a really beefy CPU. Is this true?


April 1, 2007 6:14:35 PM

Quote:
My question is: will my Opteron become a bottleneck if I were to purchase an 8800 GTS 640MB version GFX card?
I've been reading lots of articles saying the 8800 series needs a really beefy CPU. Is this true?

No.

Any modern CPU, preferably a dual core, is fine.

People who say you need an X6800 are morons. It won't provide a worthwhile improvement over a mid-range CPU.
April 2, 2007 4:09:23 PM

1600x1200 :D 

I enjoy playing games at max graphical settings and the highest resolution my LCD will handle.
April 2, 2007 4:13:26 PM

Your opty will be fine for everything except maybe some of the next gen games that really need dual core.

Supreme Commander is the first of 'em. It remains to be seen if Crysis will be one, too...
April 2, 2007 6:09:57 PM

Quote:

You cannot just assume that any modern dual core is fine with a G80.


I'll disagree on that. I've been using an 8800 GTX with my e4300, at stock 1.8 GHz speeds even... the system works great. I doubt there's a single game where the 8800 GTX/e4300 combo couldn't beat, say, an e6600/X1950 XTX combo at a decent resolution... say 1600x1200.

Sure, there'd be gains from a faster CPU. There will ALWAYS be gains from a faster CPU. But you'd see more of a scaling drop from a slower card.

The lion's share of gaming performance is still in the video card.
April 2, 2007 8:21:34 PM

I'm not getting into the above battle only because I think the participants are both right, but for different things. I think there's confusion about bottleneck (can't get above 20fps minimum) versus holding back from potential (getting 100fps avg versus 150fps).

Blunt, the most important thing to remember is what game/app you expect to be playing. Some are more cpu-centric than others, and others don't even run near 100%.

So while you may be able to do better, it'll likely be quite gameable in most current games.
April 2, 2007 8:28:35 PM

Quote:

I realize that, but you're missing the point.


Am I?

The point, I would think, is that you would not recommend that a person without a high-end dual-core CPU get an 8800 GTX, because it would bottleneck them.

My point is that if they can afford the best card they can get, it won't bottleneck them to any significant degree.

Let's consider the results of the article at Tom's
http://www.tomshardware.com/2006/11/29/geforce_8800_nee...

Concentrating on the 1600x1200 resolution - the minimum you'd want to play at if you went out and purchased an expensive GTX, right? If you want to play at 1280x1024, you probably don't need an 8800 in the first place...

Doom3: 4xAA 8xAF, 1600x1200
Core2 Extreme 6800 with 8800 GTX: 123 fps
Core2 Extreme 6800 with X1950 XT: 80 fps
Athlon64 FX 60 with 8800 GTX: 108 fps

FEAR: 4xAA 8xAF, 1600x1200
Core2 Extreme 6800 with 8800 GTX: 83 fps
Core2 Extreme 6800 with X1950 XT: 57 fps
Athlon64 FX 60 with 8800 GTX: 79 fps

Oblivion: Outdoors, 1600x1200
Core2 Extreme 6800 with 8800 GTX: 39 fps
Core2 Extreme 6800 with X1950 XT: 22 fps
Athlon64 FX 60 with 8800 GTX: 35 fps


Scenario by scenario, what's the bottleneck? Is it the videocard or is it the processor? Seems pretty obvious that the card is a much bigger bottleneck than the processor. And at higher resolutions in that article, the CPU is even less of a bottleneck.

Seems like if you're going to be playing at high resolutions with eye candy, the processor isn't that much of a bottleneck at all...

...And like I said, if you're playing at 1280x1024 with no AA, why are you paying the extra hundreds of dollars for the 8800 GTX in the first place? At decent resolutions, the processor bottleneck is all but removed.
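
If it helps to make that concrete, here's a quick back-of-the-napkin script (Python; just my own tally of the Tom's numbers quoted above, nothing official) that totals how many fps each downgrade costs:

Code:
# fps at 1600x1200 from the Tom's article quoted above:
# game: (X6800 + 8800 GTX, X6800 + X1950 XT, FX-60 + 8800 GTX)
results = {
    "Doom 3":   (123, 80, 108),
    "FEAR":     ( 83, 57,  79),
    "Oblivion": ( 39, 22,  35),
}

for game, (best, slower_gpu, slower_cpu) in results.items():
    gpu_hit = best - slower_gpu  # fps lost by swapping in the slower card
    cpu_hit = best - slower_cpu  # fps lost by swapping in the slower CPU
    print(f"{game}: card swap costs {gpu_hit} fps, CPU swap costs {cpu_hit} fps")

Run it and the card swap costs you 43, 26, and 17 fps, while the CPU swap costs you 15, 4, and 4. Same conclusion, just with the arithmetic laid out.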
April 2, 2007 8:37:14 PM

Quote:

Nice try. Both Fear and Oblivion have very little impact on CPU performance. Doom 3 however shows a completely different story and so do the other benches I posted.


What are you saying... the fact that the video card has a much higher impact on performance than the CPU in all 3 games is irrelevant?

Even in Doom3 you take a 43 fps hit with an X1950 but only a 15 fps hit with a slower processor. Are you saying that the CPU is still the primary bottleneck? That doesn't jibe with the facts...

Are you arguing that you're better off investing in a faster CPU and a slower videocard that takes a 43 fps hit, rather than just getting a faster videocard that puts you within 15 fps of the top CPU/card combo?
Where's the sense in that?
April 2, 2007 8:47:42 PM

Quote:
Well Durr dee durr what do you think we have been talking about :lol: 


Value and performance, I thought... :roll:

The OP has an Opty OC'd to 2.5 GHz and a 7900 GT. He has enough to buy an 8800 GTX; we'll assume $550.

You're saying his $550 upgrade cash is better spent on something other than an 8800 GTX because of a 'CPU bottleneck'?

What do you suggest instead, that he buys an e6600 platform, mobo, and DDR2 memory for $550 to use with his 7900 GT so he can see a 5% increase in game performance?
Hell, the FX60 and an X1950 XT only got 80fps in Doom3 @ 1600x1200. So what, he might get 70 fps? He's probably getting 65 fps now!

Now your suggestion is that he does that instead of getting an 8800 GTX that will put him a mere 15fps away from a Core2 Extreme X6800/8800 GTX combo (108 fps in this instance)?

Is there a meaningful bottleneck demonstrated on the FX 60/8800 GTX combo? If so, please point it out...

Of course a faster CPU will result in better performance. But it's so insignificant it could hardly be termed a bottleneck.

Whatever you call it, if you want to argue semantics be my guest, but it's blatantly obvious the smart money is on the videocard upgrade...

You complain that people make recommendations without knowing what they're talking about, but all the evidence suggests you're the one recommending based on flawed logic.
April 2, 2007 8:56:47 PM

Quote:

Stop putting words in my mouth cleeve :roll: I clearly said earlier that a stock E6300 or comparable CPU will provide adequate performance for most people when paired up with an 8800GTX.


I thought you said "there was certainly a big bottleneck at the E6400's stock speed."

But, sure! How much is an e6300? $181 on Newegg.
Let's be conservative and say $120 for a mobo and $200 for two gigs of DDR2. We're already at $500.

Even if he had enough left over to go for an X1950 XT (which he doesn't), his gaming performance is still far below what it would have been with a simple 8800 GTX upgrade with his current CPU, without the pain of changing his rig and reinstalling his OS.

But I don't want to put words in your mouth, Rob. You tell US what you suggest he does with his $550 for a gaming upgrade, if the 8800 GTX is such a bad idea because of his CPU bottleneck... :roll:
April 2, 2007 9:03:56 PM

Damn... the war is over?
Quote:
Sure! How much is an e6300? $181 on Newegg.
Let's be conservative and say $120 for a mobo and $200 for two gigs of DDR2. We're already at $500.

Even if he had enough left over to go for an X1950 XT (which he doesn't), his gaming performance is still far below what it would have been with a simple 8800 GTX upgrade with his current CPU, without the pain of changing his rig and reinstalling his OS.

But I don't want to put words in your mouth, Rob. You tell US what you suggest he does with his $550 for a gaming upgrade, if the 8800 GTX is such a bad idea because of his CPU bottleneck...

100% Agree. :D 
5 stars from me!
April 2, 2007 9:07:36 PM

Quote:

I just did, idiot :roll: I said he should get the 8800GTX if he wishes, but also suggested that he overclock the CPU as much as he can to get the most out of his 8800GTX.


Lol, you can always tell you've gotten under someone's skin when the name-calling comes out. I am honored to have upset you so fundamentally. 8)

BTW, the OP's CPU is already overclocked, Rob.

I think I'll just requote Prozac's simple but insightful post:

"Any modern CPU, preferably a dual core is fine.

People who say you need a X6800 are morons. It won't provide a worthy improvements over a mid-range CPU."
April 2, 2007 9:09:51 PM

Quote:
Go back and read what I have posted you fukin noob :roll:

Hey dumb @$$,
I had 700 posts with my previous account (PX7800GT)... I'm not a noob.
Quote:
...read what I have posted...

You have posted nothing but bulls**t.
April 2, 2007 9:10:00 PM

Quote:
stupid sh1t like cleeve


Two insults! That's two more points for me!

I don't want to brag, but you've still got zero, Rob... I've goaded you into three, now. :D 

I love you too much to call you names. :wink:
April 2, 2007 9:11:08 PM

Quote:

I realize that moron


4-0

:twisted:
April 2, 2007 9:13:08 PM

Quote:
Just some **** lover looking to stick his prick up a senior members a$shole :roll: :roll:


I guess I can't technically count those, as they weren't directly fired at me. :( 

Still, it speaks to my success! :twisted:

[edit]I have to go Rob. We'll consider this one 4-0 in my favor unless you want to call me more names, OK? You can fling them at me in my absence and I'll tally them up later. :)  [/edit]
April 2, 2007 9:16:36 PM

Hey dumbass!
I told you I'm not a noob.
You are so angry because you lost the war to Cleeve.
So...
GO F**K YOURSELF
April 2, 2007 9:23:36 PM

Quote:
Hey, what about me? :evil: 

Oh, sorry my friend, my bad :tongue:
Quote:
oh and why the change of account? not like that name?

Yeah man, I can't find a way to change my name, so I decided to make a new account.
Good to see you all :D 
April 2, 2007 9:25:24 PM

By adding the 8800GTS, you should be able to enjoy most games at that resolution with no problems, Bluntside.

For the argument going on: as long as you can play the game, does an extra 3fps MATTER anyway? A 2.5GHz AMD dual-core WILL NOT bottleneck an 8800GTS. A bottleneck happens when your machine can't run the game any faster than, let's say, 20fps. A bottleneck doesn't mean you can't run the game at 100fps+! Anyway, anything over 60fps is gravy because your eye can't tell the difference!
April 2, 2007 9:46:16 PM

Quote:
Anyway, anything over 60fps is gravy anyway because your eye can't tell the difference!


Don't let the rest of the thread lull you into thinking statements like that are acceptable.

The human eye can tell the difference at much higher framerates; what may be holding YOU back is YOUR brain or YOUR eyes.
April 2, 2007 9:59:13 PM

I can't tell the difference myself. Then again, as long as the game is smooth, I don't care about what FPS it's running at.
April 2, 2007 10:03:33 PM

Quote:
My point about LCDs not being able to display it is accurate, and I will argue that while some people may be able to tell that there are frames up to 300 (not sure how that was tested) or whatever, they may not be able to tell the difference in gameplay between 80 and 100, especially since FPS is variable with v-sync off. JMO, and let's not forget the OP's capabilities.

My friend, what about CRT monitors?
April 2, 2007 10:12:02 PM

Oh, thanks.
April 2, 2007 10:16:16 PM

It's not an argument; there are facts and then there are misconceptions. The above statement is the common myth.

The main thing is understanding the difference between perception, which varies from person to person, and sensation, which is the physical limit of the body/organs/parts involved.

I find many games fluid at 40fps, but that doesn't mean it's the hard limit for all things. It depends on so many factors that there is no set number for perception even in the same person, and the set numbers for sensation are way into the triple digits.

I'm just tired of people confusing the two, and it's primarily in gaming that people do it, usually as some excuse like there being no benefit to refresh rates above 60Hz on a monitor.

You'd be just as irked if I said no one needed resolutions above 640x480 because TV looks great at 480i/p.

PS: I wasn't referring to the OP, I was simply referring to that statement.
April 2, 2007 10:38:15 PM

Quote:
I'm not getting into the above battle only because I think the participants are both right, but for different things. I think there's confusion about bottleneck (can't get above 20fps minimum) versus holding back from potential (getting 100fps avg versus 150fps).

Blunt, the most important thing to remember is what game/app you expect to be playing. Some are more cpu-centric than others, and others don't even run near 100%.

So while you may be able to do better, it'll likely be quite gameable in most current games.


word
April 2, 2007 10:43:47 PM

Quote:
The main thing is understanding the difference between perception, which varies from person to person, and sensation, which is the physical limit of the body/organs/parts involved.

I find many games fluid at 40fps, but that doesn't mean it's the hard limit for all things. It depends on so many factors that there is no set number for perception even in the same person, and the set numbers for sensation are way into the triple digits.

I'm just tired of people confusing the two, and it's primarily in gaming that people do it, usually as some excuse like there being no benefit to refresh rates above 60Hz on a monitor.


Also agreed. I have a hard time enjoying ANYTHING on a CRT below 80Hz, with 85+ preferred.

Games are all over the place for me: some (like Oblivion) can work fine at sub-30s; others I need near 60 or more for good times. On the HL2 engine I actually MUST run vsync or I go batty with the tearing. Just me though ;) 
April 2, 2007 10:56:16 PM

I have an AMD 3700+ and an X2 4400+ in two separate computers,
and have been swapping my 8800 GTS 640MB GPU between them.

I see no difference in F.E.A.R., Oblivion, and BF2 according to FRAPS.
This is on a 16x10 monitor.

Now FSX, yes.
April 2, 2007 10:59:57 PM

I was watching CES '07 on HDNet last weekend.

All of the newer monitors are starting to come out with a refresh rate of 120Hz. They were talking about LCDs,

and how it will help for (sports) fanatics. :lol: 
April 2, 2007 11:04:35 PM

Quote:
BTW, the OP's CPU is already overclocked, Rob.


Lol, yes, which will outperform a stock e6400.
April 2, 2007 11:53:01 PM

Microsoft's definition of a processor bottleneck:

Quote:
Processor bottlenecks occur when the processor is so busy that it cannot respond to requests for time. Although a high rate of processor activity might indicate an excessively busy processor, a long, sustained processor queue is a more certain indicator. As you monitor processor and related counters, you can recognize a developing bottleneck by the following conditions:

• Processor\ % Processor Time often exceeds 80 percent.

• System\ Processor Queue Length is often greater than 2 on a single-processor system.

• Unusually high values appear for the Processor(_Total)\ Interrupts/sec or System\ Context Switches/sec counters.
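
For anyone who wants to sanity-check the first of those conditions on their own rig, here's a rough sketch (Python with the third-party psutil package; my own approximation, not Microsoft's tooling, and it only covers the % Processor Time bullet - the queue-length counters need Performance Monitor):

Code:
import psutil  # third-party: pip install psutil

BUSY_THRESHOLD = 80.0  # "% Processor Time often exceeds 80 percent"
SAMPLES = 30           # one sample per second while the game runs

busy = 0
for _ in range(SAMPLES):
    cpu = psutil.cpu_percent(interval=1.0)  # utilization over a 1-second window
    if cpu > BUSY_THRESHOLD:
        busy += 1
    print(f"CPU: {cpu:5.1f}%")

if busy > SAMPLES // 2:
    print("Sustained high CPU load; the processor may be the bottleneck.")
else:
    print("CPU isn't pegged; look at the video card first.")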
April 3, 2007 12:20:21 AM

Quote:

Also agreed. I have a hard time enjoying ANYTHING on a CRT below 80Hz, with 85+ preferred.


Yep, at work I prefer 75Hz personally, but that's because I can't push full 19x14 @ 85, and I prefer the res over 16x12 @ 85.

Quote:
Games are all over the place for me: some (like Oblivion) can work fine at sub-30s; others I need near 60 or more for good times. On the HL2 engine I actually MUST run vsync or I go batty with the tearing. Just me though ;) 


Yep, Oblivion for me is a great example (and it helps that it's one of the few games I still play): I'm fine at 30fps for 'exploring', but in major battles where I truly bog down I need higher fps to understand where the FAQ I am in the battle.
Another example, of course, would be twitch gaming: UT for me requires high FPS for things like jump fragging and 180-degree turns (while still being able to process where you are), yet Splinter Cell I could probably play at less than 20fps and not notice any issues.

I just get sick of the 60fps myth.
April 3, 2007 12:33:35 AM

Quote:
I'm not getting into the above battle only because I think the participants are both right, but for different things. I think there's confusion about bottleneck (can't get above 20fps minimum) versus holding back from potential (getting 100fps avg versus 150fps).

Exactly my thoughts. Personally, while scaling might show 100 fps vs 110 fps vs 120 fps, etc., it's hard to get worked up about that as a bottleneck. Until it levels off, yes, there obviously is a system bottleneck to some degree, but does that really matter? Especially when we are talking 16x12 with 4x/8x. Bump up the res, FSAA, and AF some more and then what happens? Let's max out AA & AF @ 3840x1024 or 2560x1600, and any single card will tank no matter what we pair it with.

For this reason, I like [H]'s spin on best playable settings scaling. Obviously they painted a completely opposite picture of Conroe's launch while using a 7900GTX: http://enthusiast.hardocp.com/article.html?art=MTEwOCwx...

They even paint both sides of the argument going SLI 8800GTX. Up the res, and while we often see a CPU bottleneck to some degree, best playable settings are still determined by the dual 8800GTX. For example, they had to disable motion blur in Carbon, even with 8800GTX SLI: http://enthusiast.hardocp.com/article.html?art=MTI2Miw4...

I am not at all knocking these guys for sticking to their discussion, as like you said they all have valid points to some degree. But when it boils down to it, the OP has an Opteron at 2.5GHz and is asking about gaming with an 8800GTS. Sure, a 2.93GHz Extreme would out-bench it, and even by a lot in many games. But overall, his max playable settings will be way more limited by the GTS than by the CPU.
April 3, 2007 1:26:12 AM

Exactly, but I find my brain is the bottleneck for most games. That's why I play on crap hardware; any faster and I'll experience grey-matter tearing. :twisted:
April 3, 2007 1:42:23 AM

Quote:

Also agreed. I have a hard time enjoying ANYTHING on a CRT below 80Hz, with 85+ preferred.


Yep, at work I prefer 75Hz personally, but that's because I can't push full 19x14 @ 85, and I prefer the res over 16x12 @ 85.

Quote:
Games are all over the place for me: some (like Oblivion) can work fine at sub-30s; others I need near 60 or more for good times. On the HL2 engine I actually MUST run vsync or I go batty with the tearing. Just me though ;) 


Yep, Oblivion for me is a great example (and it helps that it's one of the few games I still play): I'm fine at 30fps for 'exploring', but in major battles where I truly bog down I need higher fps to understand where the FAQ I am in the battle.
Another example, of course, would be twitch gaming: UT for me requires high FPS for things like jump fragging and 180-degree turns (while still being able to process where you are), yet Splinter Cell I could probably play at less than 20fps and not notice any issues.

I just get sick of the 60fps myth.

I work mostly at 90Hz,
but sadly my videocard doesn't pull as many fps as I would have loved :p 
April 3, 2007 5:05:35 AM

So...what's a bottleneck?? :D 
April 3, 2007 5:16:38 AM

Quote:
So...what's a bottleneck?? :D 


A bottleneck is a device that delivers beer into your mouth.
April 3, 2007 5:50:43 AM

Quote:
So...what's a bottleneck?? :D 


A bottleneck is a device that delivers beer into your mouth.

:lol: 
April 3, 2007 6:11:33 AM

Quote:
So...what's a bottleneck?? :D 


OK, so that we settle this once and for all: NONE of you here seems to know what a bottleneck is, which is why I refer you to the image below:

[image of three kinds of bottlenecks, not shown]

As you can see, there are 3 different kinds of bottlenecks. First you need to decide which one you are talking about, then argue! LOL! :lol: 
April 3, 2007 6:37:32 AM

I don't know about the Opty, but with my E6600 I get between 20 and 30% more fps in most (new) games when I O/C than when I run @ stock. That's a HUGE difference. Sure, if you run @ relatively "low" resolutions you may not see the difference, but @ 1920x1200 (my LCD's res.) you notice it immediately.
April 3, 2007 7:07:19 AM

I have an 8800 paired with a super Athlon 64.

My question is: could the 8800 be blocking the CPU? :? :lol: 



How is that for an angle :twisted:
April 3, 2007 8:15:33 AM

Quote:
My layout :D 

DFI LP SLI-Expert
150GB Raptor X
2x 1GB xms Expert ddr400
OPteron170 OC to 2.5ghz
EVGA 7900GT OC to 585/1790
Antec Neo HE 550watt
_____________________________________
My question is will my Opteron become a bottleneck if I was to purchase a 8800gts 640 mb version GFX card?
Ive been reading lots of articles on the 8800 series needing a really beeefy cpu, is this tru3?


Your Opteron will do just fine, and so will the 8800 GTS. You're all set.
April 3, 2007 8:41:23 PM

Quote:
First up is Oblivion, where we benchmarked a timed FRAPS run through of a savegame within the Bruma town gates. Oblivion has just as many CPU bound situations as it does GPU bound situations, this benchmark is merely one of the CPU bound cases. We tested Oblivion with the 1.1 patch and the game's default Very High quality settings, which can still be CPU bound with an 8800 GTX at 1600 x 1200:


http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2933&p=8

I don't know what some of you consider a "bottleneck," but I call upgrading to a $600 graphics card and not being able to play a game that's now a year and a half old at 60fps a problem. With a better processor, you don't run into issues like the above.

I think many around here are arguing semantics, but the facts are that a system equipped with an 8800GTX needs a faster processor to see its full potential. You have to keep your system's components in equilibrium. You don't go upgrading the hell out of one and leave the rest to rot.

With that said, is an Opty @ 2.5GHz enough to see significant gains from an upgrade to an 8800GTX? Absolutely. But the gains seen from a processor upgrade to even a stock-speed E6700 are immense as well, and when you overclock it over 3GHz you can see as much as 30+fps gains in many titles.

Here's one in particular that saw 40+fps gains w/ an 8800GTX paired with a fast Core2Duo in relation to an AMD processor, and it's at 1600x1200 resolution:

Quote:
Battlefield 2 echoes what we've seen elsewhere, although the Intel performance advantage has shrunk to only 11% in the case of the E6700 vs. 6000+.

[benchmark chart not shown]

One could say "yeah, well, that's over 100fps and you won't even notice it" and that's true. However, this same problem will scale down to around 35-40fps in CPU-bound situations in games like F.E.A.R. and a few others that still hit the processor hard.

I think what Rob is trying to say is that a slower processor bottlenecks the entire system when you equip it with a graphics card as strong as an 8800GTX, which it does. If the 8800GTX is so strong that it's running away with everything, and the CPU is not keeping up with the workload it's responsible for (physics, A.I., computations), then the CPU is going to slow everything down and take your FPS with it. And if you think Oblivion is bad, just wait until games like Crysis hit, with all the advanced physics and computations that will be going on all around you. A 3800X2 AMD processor will be a major bottleneck on your system compared to running a fast, overclocked C2D that can run through calculations nearly twice as fast.

Anyone who argues that is just dumb as a rock.
April 3, 2007 9:10:58 PM

Quote:

Whether or not it's a big bottleneck is beside the point, because the bottlenecking is still there.


By that logic, everything is a bottleneck. Turning sound on is a bottleneck in a game because framerates will drop 4%, so everybody! Avoid the bottleneck! Turn off sound or it's not worth it to buy the best videocard you can afford! :roll:

Are you arguing for the sake of calling everything that limits performance - no matter how small - a bottleneck?

Bottlenecking implies a meaningful limitation to performance. As the numbers have shown, different CPUs have less of an impact on performance than different videocards.

If you want to call a meaningless differential in performance a 'bottleneck' because you like the semantics argument, go to town.

But in real life, any modern dual-core CPU will serve an 8800 GTX well, unless you're playing at low resolutions...

Prove me wrong with numbers if you can, fellows, but try to resist the urge to belittle yourself into flinging bum/penis insults. If it's the only way you know how to argue, then at the very least try to make them clever or funny, eh?... :twisted:
April 3, 2007 9:16:12 PM

Cleeve, check out my post above. A slower processor can make Oblivion take a 25fps hit compared to a faster Core2Duo. I'd call that a "significant impact on performance" and thus, a "bottleneck."
April 3, 2007 9:16:14 PM

Quote:

Anyone who argues that is just dumb as a rock.


Or maybe you could say the same about anyone who makes bottleneck arguments with benchmarks that avoid AA & AF on an 8800 GTX...

What, you're going to spend $550 on a videocard to run it without eye candy? :) 

Gotta go gents, laters...
April 3, 2007 9:17:58 PM

Quote:

Anyone who argues that is just dumb as a rock.


Or maybe you could say the same about anyone who makes bottleneck arguments with benchmarks that avoid AA & AF... :) 

Or better yet, I could say the same about someone who ignores the fact that AA and AF barely make the G80s break a sweat, which is why the pros didn't include it in their benchmarks. Or did you overlook all the performance figures of the cards when they first came out?

4xAA on an 8800GTX will cost you a ~5fps hit for every 60. AF even less.

And I admire the backpedaling job you're doing here, because just a second ago you said this:

Quote:
But in real life, any modern dual-core CPU will serve an 8800 GTX well, unless you're playing at low resolutions...


Do you consider 1600x1200 a "low resolution?" If not, then you need to take a look at the benchmarks I've given you for 1600 res that show around a 30-35% increase in performance on a better processor.
April 3, 2007 9:19:56 PM

Quote:

Or better yet, I could say the same about someone who ignores the fact that AA and AF barely make the G80s break a sweat.


Excellent!

Then finding appropriate data to back up your claim should be easy, shouldn't it? :) 
April 3, 2007 9:23:41 PM

Quote:
Excellent!

Then finding appropriate data to back up your claim should be easy, shouldn't it? :) 


And to you as well. So... where is it? Where is your proof that AA and AF enabled even out performance among the CPUs?

I've offered benchmarks to prove my point, and you come back at me with semantics. I will look for some more benches w/ AA/AF applied, but for now you ought to consider the words of someone who actually owns the card in question, along with those who do this kind of thing for a living, like the guys at Anandtech.

[Edit: And something else. The AA/AF argument really doesn't work anyway, because eventually there will come games that do stress the G80 and you will have to cut back the AA/AF to squeeze as many frames as you can out of games. And at that time, if a game is running 20-30fps faster on a better processor, then it's only going to magnify the difference in performance even more.]
April 3, 2007 9:31:09 PM

More #'s:

[Anandtech benchmark chart not shown]

http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2866&p=17

No AA/AF applied here obviously, but AA/AF on a G80 isn't going to drag it down 30fps. On average, I'm seeing about 8-10fps less for every 60 with 16xAA enabled on a single GTX when I disable SLI. So if we do the #'s here and drop the theoretical X6800 paired with the 8800GTX to 69fps, that is still a 22fps difference in performance over the slower AMD processor, or 22/47, which is a 47% performance increase.

I'd say a 47% performance increase is significant.
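
For what it's worth, here's the arithmetic behind that 47% laid out (a sketch only; the 79 and 47 fps starting values are my reconstruction from the chart, which isn't reproduced here, and the AA penalty is the rough 8-10 fps per 60 estimate above):

Code:
x6800_gtx = 79    # assumed chart value: X6800 + 8800 GTX, no AA
aa_penalty = 10   # rough estimate for 16xAA on a single GTX
slower_amd = 47   # assumed chart value for the slower AMD chip

with_aa = x6800_gtx - aa_penalty      # 69 fps, as stated above
gain = with_aa - slower_amd           # 22 fps difference
print(f"{gain} / {slower_amd} = {gain / slower_amd:.0%}")  # ~47%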
April 3, 2007 9:46:54 PM

Quote:
...eventually there will come games that do stress the G80 and you will have to cut back the AA/AF to squeeze as many frames as you can out of games...

I think if someone has enough money to buy an 8800GTX or two (like you), it means that he/she can easily pay for better graphics cards when needed (a GeForce 9800 GTX, for example).
So there is no point in disabling AA & AF with one or two 8800GTXs.