bluntside

Distinguished
Mar 22, 2006
744
0
19,010
My layout :D

DFI LP SLI-Expert
150GB Raptor X
2x 1GB xms Expert ddr400
OPteron170 OC to 2.5ghz
EVGA 7900GT OC to 585/1790
Antec Neo HE 550watt
_____________________________________
My question is: will my Opteron become a bottleneck if I were to purchase an 8800 GTS 640 MB graphics card?
I've been reading lots of articles saying the 8800 series needs a really beefy CPU. Is this true?
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
My question is: will my Opteron become a bottleneck if I were to purchase an 8800 GTS 640 MB graphics card?
I've been reading lots of articles saying the 8800 series needs a really beefy CPU. Is this true?
No.

Any modern CPU, preferably a dual core, is fine.

People who say you need an X6800 are morons. It won't provide a worthwhile improvement over a mid-range CPU.
 

cleeve

Illustrious
Your Opty will be fine for everything except maybe some of the next-gen games that really need dual core.

Supreme Commander is the first of 'em. It remains to be seen whether Crysis will be one, too...
 

cleeve

Illustrious
You cannot just assume that any modern dual core is fine with a G80.

I'll disagree on that. I've been using an 8800 GTX with my E4300, even at its stock 1.8 GHz speed... the system works great. I doubt there's a single game where the 8800 GTX/E4300 combo couldn't beat, say, an E6600/X1950 XTX combo at a decent resolution... say 1600x1200.

Sure, there'd be gains from a faster CPU. There will ALWAYS be gains from a faster CPU. But you'd see more of a scaling drop from a slower card.

The lion's share of gaming performance is still in the video card.
 
I'm not getting into the above battle, only because I think the participants are both right, but about different things. I think there's confusion between a bottleneck (can't get above a 20 fps minimum) and holding back from potential (getting a 100 fps average versus 150 fps).

Blunt, the most important thing to remember is what game/app you expect to be playing. Some are more CPU-centric than others, and some don't even load the CPU near 100%.

So while you may be able to do better, it'll likely be quite gameable in most current games.
 

cleeve

Illustrious
I realize that, but you're missing the point.

Am I?

The point, I would think, is that you would not recommend that a person without a high-end dual-core CPU get an 8800 GTX, because it would bottleneck them.

My point is that if they can afford the best card they can get, it won't bottleneck them to any significant degree.

Let's consider the results of the article at Tom's
http://www.tomshardware.com/2006/11/29/geforce_8800_needs_the_fastest_cpu/

Concentrating on the 1600x1200 resolution - the minimum you'd want to play at if you went out and purchased an expensive GTX, right? If you want to play at 1280x1024, you probably don't need an 8800 in the first place...

Doom 3: 4xAA 8xAF, 1600x1200
Core 2 Extreme X6800 with 8800 GTX: 123 fps
Core 2 Extreme X6800 with X1950 XT: 80 fps
Athlon 64 FX-60 with 8800 GTX: 108 fps

FEAR: 4xAA 8xAF, 1600x1200
Core 2 Extreme X6800 with 8800 GTX: 83 fps
Core 2 Extreme X6800 with X1950 XT: 57 fps
Athlon 64 FX-60 with 8800 GTX: 79 fps

Oblivion: Outdoors, 1600x1200
Core 2 Extreme X6800 with 8800 GTX: 39 fps
Core 2 Extreme X6800 with X1950 XT: 22 fps
Athlon 64 FX-60 with 8800 GTX: 35 fps


Scenario by scenario, what's the bottleneck? Is it the video card or the processor? It seems pretty obvious that the card is a much bigger bottleneck than the processor. And at higher resolutions in that article, the CPU is even less of a bottleneck.

Seems like if you're going to be playing at high resolutions with eye candy, the processor isn't that much of a bottleneck at all...

...And like I said, if you're playing at 1280x1024 with no AA, why are you paying hundreds of dollars extra for the 8800 GTX in the first place? At decent resolutions, the processor bottleneck is all but removed.
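To make the comparison concrete, here's a quick sketch (my own illustration, not from the article) that tallies how many frames each downgrade costs, using the fps figures quoted above:

```python
# Using the Tom's Hardware numbers quoted above (1600x1200).
# For each game: (X6800 + 8800 GTX, X6800 + X1950 XT, FX-60 + 8800 GTX)
results = {
    "Doom 3":   (123, 80, 108),
    "FEAR":     (83, 57, 79),
    "Oblivion": (39, 22, 35),
}

for game, (top, slower_gpu, slower_cpu) in results.items():
    gpu_hit = top - slower_gpu  # fps lost by downgrading the video card
    cpu_hit = top - slower_cpu  # fps lost by downgrading the CPU
    print(f"{game}: GPU downgrade costs {gpu_hit} fps, "
          f"CPU downgrade costs {cpu_hit} fps")
```

In every case the GPU swap costs several times as many frames as the CPU swap (43 vs. 15 in Doom 3, 26 vs. 4 in FEAR, 17 vs. 4 in Oblivion).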
 

cleeve

Illustrious
Nice try. CPU performance has very little impact on both FEAR and Oblivion. Doom 3, however, shows a completely different story, and so do the other benchmarks I posted.

What are you saying... that the fact that the video card has a much higher impact on performance than the CPU in all three games is irrelevant?

Even in Doom 3 you take a 43 fps hit with the X1950 but only a 15 fps hit with the slower processor. Are you saying the CPU is still the primary bottleneck? That doesn't jibe with the facts...

Are you arguing that you're better off investing in a faster CPU and a slower video card that costs you 43 fps, rather than just getting the faster video card, which leaves you only 15 fps shy of the top CPU/card combo?
Where's the sense in that?
 

cleeve

Illustrious
Well Durr dee durr what do you think we have been talking about :lol:

Value and performance, I thought... :roll:

The OP has an Opty OC'd to 2.5 GHz and a 7900 GT. He has enough to buy an 8800 GTX; we'll assume $550.

You're saying his $550 upgrade cash is better spent on something other than an 8800 GTX because of a 'CPU bottleneck'?

What do you suggest instead? That he buy an E6600 platform - CPU, mobo, and DDR2 memory - for $550 to use with his 7900 GT, so he can see a 5% increase in game performance?
Hell, the FX-60 and an X1950 XT only got 80 fps in Doom 3 at 1600x1200. So what, he might get 70 fps? He's probably getting 65 fps now!

So your suggestion is that he do that instead of getting an 8800 GTX that would put him a mere 15 fps behind a Core 2 Extreme X6800/8800 GTX combo (108 fps in this instance)?

Is there a meaningful bottleneck demonstrated on the FX 60/8800 GTX combo? If so, please point it out...

Of course a faster CPU will result in better performance. But it's so insignificant it could hardly be termed a bottleneck.

Whatever you call it, and argue semantics if you like, it's blatantly obvious that the smart money is on the video card upgrade...

You complain that people make recommendations without knowing what they're talking about, but all the evidence suggests you're the one recommending based on flawed logic.
 

cleeve

Illustrious
Stop putting words in my mouth, cleeve :roll: I clearly said earlier that a stock E6300 or comparable CPU will provide adequate performance for most people when paired with an 8800 GTX.

I thought you said "there was certainly a big bottleneck at the E6400's stock speed."

But, sure! How much is an E6300? $181 on Newegg.
Let's be conservative and say $120 for a mobo and $200 for two gigs of DDR2. We're already at $500.

Even if he had enough left over to go for an X1950 XT (which he doesn't), his gaming performance would still be far below what it would have been with a simple 8800 GTX upgrade on his current CPU, and without the pain of rebuilding his rig and reinstalling his OS.

But I don't want to put words in your mouth, Rob. You tell US what you suggest he does with his $550 for a gaming upgrade, if the 8800 GTX is such a bad idea because of his CPU bottleneck... :roll:
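The cost arithmetic here can be sketched like this (my own tally; prices are the 2007 figures quoted in the post, not current ones):

```python
# Tallying the platform-swap path against the assumed $550 budget.
BUDGET = 550

platform_swap = {
    "E6300 CPU":   181,  # Newegg price quoted in the post
    "motherboard": 120,  # conservative estimate from the post
    "2 GB DDR2":   200,
}
platform_cost = sum(platform_swap.values())
leftover = BUDGET - platform_cost

print(f"Platform swap: ${platform_cost}, leaving ${leftover} for a video card")
# An X1950 XT cost far more than the leftover at the time, so the swap
# leaves no room for a meaningful GPU upgrade.
```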
 

Farhang

Distinguished
Mar 20, 2007
549
0
18,980
Damn...The War is over?
Sure! How much is an E6300? $181 on Newegg.
Let's be conservative and say $120 for a mobo and $200 for two gigs of DDR2. We're already at $500.

Even if he had enough left over to go for an X1950 XT (which he doesn't), his gaming performance would still be far below what it would have been with a simple 8800 GTX upgrade on his current CPU, and without the pain of rebuilding his rig and reinstalling his OS.

But I don't want to put words in your mouth, Rob. You tell US what you suggest he does with his $550 for a gaming upgrade, if the 8800 GTX is such a bad idea because of his CPU bottleneck...
100% Agree. :D
5 star from me!
 

cleeve

Illustrious
I just did, idiot :roll: I said he should get the 8800 GTX if he wishes, but also suggested he overclock the CPU as much as he can to get the most out of his 8800 GTX.

Lol, you can always tell you've gotten under someone's skin when the name calling comes out. I am honored to have upset you so fundamentally. 8)

BTW, the OP's CPU is already overclocked, Rob.

I think I'll just requote Prozac's simple but insightful post:

"Any modern CPU, preferably a dual core, is fine.

People who say you need an X6800 are morons. It won't provide a worthwhile improvement over a mid-range CPU."
 

cleeve

Illustrious
Just some **** lover looking to stick his prick up a senior members a$shole :roll: :roll:

I guess I can't technically count those, as they weren't directly fired at me. :(

Still, it speaks to my success! :twisted:

[edit]I have to go Rob. We'll consider this one 4-0 in my favor unless you want to call me more names, OK? You can fling them at me in my absence and I'll tally them up later. :) [/edit]
 

runswindows95

Distinguished
By adding the 8800 GTS, you should be able to enjoy most games at that resolution with no problems, Bluntside.

As for the argument going on: as long as you can play the game, does an extra 3 fps really MATTER? A 2.5 GHz AMD dual core WILL NOT bottleneck an 8800 GTS. A bottleneck happens when your machine can't run the game any faster than, say, 20 fps; it doesn't mean you can't run the game at 100+ fps. Anyway, anything over 60 fps is gravy because your eye can't tell the difference!
 
Anyway, anything over 60 fps is gravy because your eye can't tell the difference!

Don't let the rest of the thread lull you into thinking statements like that are acceptable.

The human eye can tell the difference between much higher frame rates; what may be holding YOU back is YOUR brain or YOUR eyes.
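One way to ground this debate is in frame times rather than frame rates. The sketch below (my illustration, not from the thread) shows how much screen time each frame gets at various rates:

```python
# Convert frame rates to per-frame display times. The absolute gain per
# extra fps shrinks at higher rates but never reaches zero, which is why
# "over 60 fps is gravy" is an oversimplification.
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 80, 100, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")
```

Going from 80 to 100 fps, for example, shaves 2.5 ms off each frame; whether a given person can perceive that is a separate question from whether the difference physically exists.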
 

Farhang

Distinguished
Mar 20, 2007
549
0
18,980
My point about LCDs not being able to display those frame rates is accurate, and I will argue that while some people may be able to tell there are frames up to 300 (not sure how that was tested) or whatever, they may not be able to tell the difference in gameplay between 80 and 100, especially since FPS is variable with v-sync off. JMO, and let's not forget about the OP's capabilities.
My friend, what about CRT monitors?
 
It's not an argument; there's fact, and then there are misconceptions. The statement above is a common myth.

The main thing is understanding the difference between perception, which varies from person to person, and sensation, which is the physical limit of the body/organs/parts involved.

I find many games fluid at 40 fps, but that doesn't mean it's the hard limit for everyone. It depends on so many factors that there is no set number for perception, even in the same person, and the set numbers for sensation are well into the triple digits.

I'm just tired of people confusing the two. It's primarily in gaming that people do it, usually as an excuse, like claiming there's no benefit to refresh rates above 60 Hz on a monitor.

You'd be just as irked if I said no one needed resolutions above 640x480 because TV looks great at 480i/p.

PS: I wasn't referring to the OP, I was simply referring to that statement.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
I'm not getting into the above battle, only because I think the participants are both right, but about different things. I think there's confusion between a bottleneck (can't get above a 20 fps minimum) and holding back from potential (getting a 100 fps average versus 150 fps).

Blunt, the most important thing to remember is what game/app you expect to be playing. Some are more CPU-centric than others, and some don't even load the CPU near 100%.

So while you may be able to do better, it'll likely be quite gameable in most current games.

word