AMD 3700 & 8800 good match?

Karol

Distinguished
Dec 21, 2004
9
0
18,510
Will an AMD 3700 bottleneck the 8800 by a lot, and will it be a good setup for the future?

Thanks in advance.
 

Karol

Distinguished
Dec 21, 2004
9
0
18,510
OK, would there be a big difference between the GTX and the GTS? Because if it's just 10 fps, the extra $200 isn't worth it in my opinion.
 

Tostino

Distinguished
Jun 3, 2006
61
0
18,630
Any 8800 GPU needs at least a Core 2 Duo E6400 or an Athlon X2 5200.
You don't "need" anything that powerful for an 8800. Sure, a faster CPU will get a little more out of it than a 3700 will, but you don't need to.

I say get the GTS, since your CPU is going to be the limiting factor anyway. That way you can save $200 and really not see any drop in performance.
 

zeapoorte

Distinguished
Jul 15, 2006
64
0
18,630
There was just an article on Tom's Hardware that discussed this very matter, titled "8800 needs a faster CPU" or something like that.
You may want to check it out.
 

halbhh

Distinguished
Mar 21, 2006
965
0
18,980
There was just an article on Tom's Hardware that discussed this very matter, titled "8800 needs a faster CPU" or something like that.
You may want to check it out.

That old article was ripped to shreds in the comments (at the end of the article, after the first few comments).

The flaws in its reasoning include the idea that 140 fps is somehow better than 100 fps, to use a random example -- that is, that huge frame rates well past the ~70 fps limit of the display monitor and the human eye matter to a gamer!

Not!

It's just common sense that 150 fps is no better than 100 fps. On the other hand, when you get a really big monitor with a high resolution, even the mighty 8800 GTX is humbled and can be slowed to well below 60 fps (at 2500x resolution in Oblivion outdoors, for one), and a faster CPU won't help! That's called being "GPU bound".

See?

This is where common sense helps, more than just looking at one benchmark number.

This was figured out by me (and others, I'm sure -- it wasn't hard to reason out, no great insight required), and it has appeared in several articles since then, if you seek them out.

So: go ahead and get that 8800, whichever one you want, and have fun. Later, when you finally upgrade your CPU someday, just put the 8800 to work with the new CPU, no problem.

Finally, for those who can't get past the fact that the 3700 isn't enough for a couple of new games (e.g. the new flight simulator): really, so what? He'll upgrade the CPU when he wants to, and the 8800 will serve well both now and after the eventual CPU upgrade.
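To put rough numbers on the "GPU bound" idea, here is a back-of-envelope sketch (the frame-rate figures are made-up illustrations, not measurements from any article):

```python
# Toy model: the frame rate you actually see is limited by whichever
# component is slowest, and then capped by the display's refresh rate.
# All numbers here are hypothetical examples, not measured results.

def visible_fps(cpu_limit_fps, gpu_limit_fps, refresh_hz=70):
    rendered = min(cpu_limit_fps, gpu_limit_fps)  # the slowest component wins
    return min(rendered, refresh_hz)              # the monitor can't show more

# Low resolution: the CPU is the wall, but it doesn't matter, because both
# 100 and 140 fps are already past a 70 Hz refresh.
print(visible_fps(cpu_limit_fps=100, gpu_limit_fps=300))  # -> 70
print(visible_fps(cpu_limit_fps=140, gpu_limit_fps=300))  # -> 70 (no visible gain)

# Very high resolution: the GPU becomes the wall, and a faster CPU changes nothing.
print(visible_fps(cpu_limit_fps=100, gpu_limit_fps=45))   # -> 45
print(visible_fps(cpu_limit_fps=200, gpu_limit_fps=45))   # -> 45 (still GPU bound)
```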
 

scottyboi

Distinguished
Jan 19, 2007
13
0
18,510
I have the same setup right now, an AMD Athlon 64 3700+ and an 8800 GTS, and I run all current games with everything maxed out (S.T.A.L.K.E.R., Spider-Man 3) and they all run pretty smoothly. I also get around 6400 in 3DMark06. It's a pretty good combo, and like it was said before, you can always upgrade your CPU later and keep your 8800.
 

Mandrake_

Distinguished
Oct 7, 2006
355
0
18,780
I have an E6300 and an 8800 GTS and it runs fine. Overclocking the CPU results in only a minor performance increase in games running at 1680x1050 with the eye-candy on.

People with faster dual core CPUs will get much higher results in 3DMark06 than people with single core CPUs, but the difference in most games isn't all that great. Grab the 8800 now. You can always get a dual core CPU later if you want to. :)
 

mpjesse

Splendid
i'll be as sweet as a cinnabon just for you, mmk pumpkin?

if he/she can't figure out how to use this site and what it has to offer before mindlessly creating stupid threads, then he/she deserves to take shite.

you don't know how this works, do you?
 

epsilon84

Distinguished
Oct 24, 2006
1,689
0
19,780
Will an AMD 3700 bottleneck the 8800 by a lot, and will it be a good setup for the future?

Thanks in advance.

In multithreaded games it will be a bit of a bottleneck, I suppose, but you'd have to expect that. This isn't to say that the game won't be playable; it'll just run a bit slower than on a dual-core or better CPU.

In single-threaded games a 3700+ is more than enough for an 8800 GTS, though it may bottleneck the more powerful 8800 GTX.
 
Will an AMD 3700 bottleneck the 8800 by a lot, and will it be a good setup for the future?

Thanks in advance.

CPU-intensive games (with a lot of AI and so on, like RTS titles) will see average performance; graphics-intensive games will see a huge boost, especially in AA and AF performance, since that is totally GPU related.

When they say it needs at least an X6800 or so, what they mean is that the video card outclasses the CPU and that you would get better performance from a better CPU; it doesn't mean it won't perform well.


I'm doing experiments at home at the moment, running Socket 7 CPUs with an FX 5600 video card (and 1 GB of RAM!). If you run 3DMark2001 SE with an AMD K6 at 166 MHz (yes, 166 MHz, it does run!), half the benchmarks will run at decent FPS, whereas some will not, for obvious reasons.
 

Jakc

Distinguished
Apr 16, 2007
208
0
18,680
In every situation, either the CPU or the GPU is limiting the FPS.
When it's the GPU, you won't have a problem unless you drop below 30-60 FPS, and when it's the CPU, you likewise won't have a problem unless you drop below 30-60 FPS.

The AMD 3700 will not be a problem for any of today's games.
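A rough way to tell which of the two is limiting a particular game is to benchmark the same scene at a very low resolution (which approximates the CPU limit) and again at your normal resolution, then compare. Here is a small sketch of that rule of thumb; the fps values and the 10% margin are placeholders, not anything measured:

```python
# Rule-of-thumb check: benchmark the same scene twice.
# At a very low resolution the GPU has almost nothing to do, so the result
# approximates the CPU limit; at your normal resolution the GPU load is real.
# The fps values below are hypothetical placeholders.

def likely_bottleneck(fps_low_res, fps_target_res, margin=0.9):
    if fps_target_res >= fps_low_res * margin:
        return "CPU bound (resolution barely changes the result)"
    return "GPU bound (lowering the resolution frees up a lot of fps)"

print(likely_bottleneck(fps_low_res=85, fps_target_res=80))   # CPU bound
print(likely_bottleneck(fps_low_res=160, fps_target_res=55))  # GPU bound
```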
 
Will an AMD 3700 bottleneck the 8800 by a lot, and will it be a good setup for the future?

Thanks in advance.
Unless you want to play games at 1024x768 or less, it's all about the GPU. I saw a benchmark where a system with 1 GB of RAM, an 8800 GTX, and an X2 3800+ beat a system with 2 GB, an 8800 GTS, and an E6600 at everything above 1280x1024. I'm not putting down Intel here; I'm just saying that the GPU is more important than the CPU and memory at the high but reasonable resolutions the 8800s are made for.
The benchmark is on THG, comparing low end vs. mid range.
 

halbhh

Distinguished
Mar 21, 2006
965
0
18,980
Will an AMD 3700 bottleneck the 8800 by a lot, and will it be a good setup for the future?

Thanks in advance.
Unless you want to play games at 1024x768 or less, it's all about the GPU. I saw a benchmark where a system with 1 GB of RAM, an 8800 GTX, and an X2 3800+ beat a system with 2 GB, an 8800 GTS, and an E6600 at everything above 1280x1024. I'm not putting down Intel here; I'm just saying that the GPU is more important than the CPU and memory at the high but reasonable resolutions the 8800s are made for.
The benchmark is on THG, comparing low end vs. mid range.

Elbert is right. He's referring to the recent Tom's Hardware Guide article titled something like "System Builder Marathon", probably day 4; it's the one where they try out a cheap AMD X2 3800+ paired with a very expensive top card, the 8800 GTX.

The results in the article speak for themselves. And the folks in this thread suggesting better performance with better CPUs, without qualifying their statements carefully, simply don't know much about what they're talking about. It's just that simple.

One of the biggest problems is not understanding that your nice monitor only displays around 70 frames per second, and that, on the other hand, when the resolution is really high (like 2500x, or even 1600x1200), the CPU doesn't matter much for 90% of games. Often a top CPU will yield only a tiny improvement over a low-end CPU at these high resolutions. There are a very few specific exceptions, like the new flight simulator and, to a lesser extent, Supreme Commander.

If someone is trying to get a lot of bang for the buck, then it takes more detail and more information than that.

It's just reading and learning, and knowledge.
 
I have no problem with the 3700+ (other than the price I paid for one over a year ago).

Why only 3DMark? You should know it doesn't mean anything when it comes to real-world gaming.

:roll:

I don't know what that means. Please define what "real-world gaming" represents. I offered comparative benchmarks of CPUs at 2600 MHz with the 8800 GTX and GTS. I was impressed that Zeapoorte *pegged* the similarity between an X2 5200+ @ 2600 MHz and a Core 2 6400 @ 2100 MHz with an 8800 GTX.

I also posted scores with a comparable CPU that were similar. The conclusion that could be drawn (which I'm sorry I did not point out clearly) was that the 8800 GTX was CPU-bound by a single-core AMD CPU (hence the higher score with the 8800 GTS and a similar CPU @ 2600 MHz).

The Futuremark ORB project search allows an individual to compare the performance of their rig as configured -- motherboard, CPU, clock speed, graphics subsystem, RAM, driver, and OS -- against the millions of systems posted to the database. It is arguably one of the finest *real-world* indicators, because performance claims are consistent and repeatable under the same conditions, and queries can be ranked, examined, and compared by hardware and drivers.

I think it's certainly more informative and a better indicator of performance than a pretty colored line (no offense, THG!).

The Futuremark BDP includes all the industry *heavyweights*. HA! What do they know ?? :p

NOW - the matter at hand. I agree completely with the post by halbhh - except :) - there is no "...70 fps limit of the display monitors..."

Google "vertical sync" or "refresh rate". A refresh rate of 150 Hz would support 140 fps.

The maximum refresh rate is determined by three factors:
1) The rate your video card supports;
2) the rate your monitor supports; and
3) the resolution at which your monitor is set.

Lower resolutions (e.g., 800x600) support higher refresh rates than higher resolutions (e.g., 1920x1080).
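As a quick arithmetic check of that 150 Hz / 140 fps claim (illustrative numbers only, assuming vsync so that the display shows at most one unique frame per refresh):

```python
# A display refreshing at R Hz can show at most R unique frames per second,
# one per refresh interval of 1000/R milliseconds.

def refresh_interval_ms(refresh_hz):
    return 1000.0 / refresh_hz

def visible_fps_cap(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

print(refresh_interval_ms(150))    # ~6.67 ms per refresh
print(visible_fps_cap(140, 150))   # 140 -> a 150 Hz display can show every frame
print(visible_fps_cap(140, 70))    # 70  -> a 70 Hz display cannot
```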

Visual acuity is actually limited by the interpretative capability of the brain. Your brain cannot consciously make the distinction between 140 fps and 100 fps - but your eyes will!

Your brain does not physically **see** the difference but your eyes will 'process' the higher frame rates.

The result??? Your eyes will be less tired and blurry and those brain-pounding headaches from 4 straight hours of gaming will be less severe - lol

Y'all have a good day!
 

tamalero

Distinguished
Oct 25, 2006
1,125
133
19,470
What people should watch is the minimum FPS, not the maximum.
Anyone can get a burst of 1235325230 fps for a single millisecond,
but if your minimum keeps dropping below the 30 fps threshold in strategy games, or 60 in a shooter, your gameplay will be severely affected.
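For anyone logging frame times, the minimum and the average fall straight out of them. A small sketch (the millisecond values are invented for illustration):

```python
# Instantaneous fps for one frame is 1000 / frame_time_ms.
# A single fast frame inflates the peak, but the slow frames are what you feel.
# The frame times below are invented for illustration.

frame_times_ms = [8.0, 9.5, 10.2, 33.0, 11.0, 40.5, 9.8]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # total frames / total seconds
min_fps = min(fps_per_frame)                                    # worst single frame
max_fps = max(fps_per_frame)                                    # a meaningless burst

print(f"avg {avg_fps:.0f} fps, min {min_fps:.0f} fps, max {max_fps:.0f} fps")
# -> avg 57 fps, min 25 fps, max 125 fps: the average looks fine,
#    but the minimum is where the stutter shows up.
```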
 

JonnyDough

Distinguished
Feb 24, 2007
2,235
3
19,865
Are you asking about Socket 939? If you are, and you're looking at buying a cheap single-core processor, get the 4000+. It's much better; it's basically the FX-53. These single cores beat out dual cores in single-core applications, but the dual cores are better for multitasking and for applications that can use both cores. Oblivion can use both cores, for instance. I recently did research on this and decided to splurge on the much more costly Opteron 185, which can still keep up well with Core 2 Duos.

The truth is that if you're going single-core, you should go with the slower 8800 GTS models and save your cash. When they become worth nearly nothing due to new product arrivals, you can overclock the card and not worry about the warranty anymore (because it won't be worth $300). Will a slower CPU bottleneck the frame rate? Yes. But will the two work together and still get you good frame rates in today's games? Yes. Your CPU here will be a bottleneck, but then, so might your RAM (I'd recommend 2 GB of DDR400 if you're on 939). If you're going Socket AM2, the best cost/performance choice is the X2 5600+ matched with DDR2-800.