What video card would be best for my CPU without bottlenecking?

kbhadauria

Distinguished
Nov 8, 2007
Hey guys, it's that time again: I need to upgrade my video card.

My specs:

CPU - Socket 939 AMD Athlon 64 X2 3800+ dual-core (Toledo), currently at the stock 2.0 GHz; if I have to, I will OC it to 2.5+ GHz
Mobo - Asus A8N-E
RAM - 4 GB of DDR1
HDD - 320 GB WD
Case - Antec 900
PSU - 520 W

I am not going SLI at all, and I want to play modern games such as BioShock and StarCraft II when it comes out. I'd also like to get my hands on Crysis.

People have told me to go with the 8800GT, but I think it will bottleneck for sure. What card is best at my current 2.0 GHz, or if I OC it to at least 2.6 GHz?

I was thinking a 7950 GX2 or a 7900 GTX. Thanks for your help.
 
It won't matter if it bottlenecks or not. I'll say it again: the single biggest benefit a gamer gets out of upgrading is the graphics card. Period. You may bottleneck your card with that CPU, but the question you need to ask yourself is whether you'd rather get 20 FPS with a 7900 GTX or 35 FPS with a bottlenecked 8800 GT. Sure, if you had a 3.6 GHz quad-core you might get 45 FPS, but 35 is STILL better than 20.
 

kbhadauria

Distinguished
Nov 8, 2007
Oh, thanks guys. Should I then OC it from 2.0 to 2.5 GHz for any benefit?
Thanks for the quick reply, guys; much faster than other forums.
 
Perfect answer. I see way too many people asking this question. So what if a higher-end GPU won't run to its absolute full potential? As jaydeejohn said, it will still kick the pants off of a lesser GPU.
I would rather know that I have extra horsepower left in my graphics card if I ever want to upgrade other parts, instead of knowing that an upgrade is pointless because my GPU is already maxed out.

 

Heyyou27

Splendid
Jan 4, 2006
Your CPU would be fine with the 8800GT. Sure, you may bottleneck the card a bit, but it's very unlikely to be anything you'd actually notice. CPU bottlenecks are generally only observed at low resolutions with low graphics settings.
 

rgeist554

Distinguished
Oct 15, 2007
Perfect answer. I see way too many people asking this question. So what if a higher-end GPU won't run to its absolute full potential? As jaydeejohn said, it will still kick the pants off of a lesser GPU.
I would rather know that I have extra horsepower left in my graphics card if I ever want to upgrade other parts, instead of knowing that an upgrade is pointless because my GPU is already maxed out.
That's not true. I went from a 7900 to an 8800GTS with absolutely no gains; if there were any, it was 1-2%. You can't just upgrade one part and expect a miracle: everything gets slowed down to the slowest part in your system. After upgrading my mobo and CPU, I doubled, and in some cases tripled, my frame rates while also being able to raise the AA/AF settings.
 

kbhadauria

Distinguished
Nov 8, 2007
Wait, so ^ you upgraded from a 7900 to an 8800 with no gains? Wow, I am getting some mixed opinions... can anyone with a 939 board confirm this? I am not planning to upgrade until I graduate in two years, since I bought 4 GB of RAM.
 
Wouldn't this also depend on the resolution you game at? Lower resolutions lean more on the CPU than higher ones do. I think if you game at anything less than 1280x1024, you may not see much of a difference, since it will be CPU-bound. If you play higher than that, you should see some difference. We need an investigation on this. If only I had the money to experiment with my old Dell.
 

rgeist554

Distinguished
Oct 15, 2007
Wouldn't this also depend on the resolution you game at? Lower resolutions lean more on the CPU than higher ones do. I think if you game at anything less than 1280x1024, you may not see much of a difference, since it will be CPU-bound. If you play higher than that, you should see some difference. We need an investigation on this. If only I had the money to experiment with my old Dell.
This statement is both right and wrong. Let me explain how I understand it:

A higher-end card can give you more frames per second at higher resolutions than a lower-end card can. This does not mean you'll be getting 50 FPS @ 1280x1024 and magically receive 90 FPS @ 1600xWhatever. It just means that the GPU has more processing power left at higher resolutions than its predecessor does.

Let's say, as an example (we'll assume both rigs use the same hardware besides the GPU):

7900 runs @ 1280x1024 w/ 50 FPS
@ 1600x1200 w/ 25 FPS (a 50% loss, because it's leaning more on the GPU)

8800 runs @ 1280x1024 w/ 70 FPS
@ 1600x1200 w/ 68 FPS (a very small drop, because the GPU is more powerful)

All in all, a stronger GPU is not going to push your frame rates at a higher resolution above what you get at a lower one. You'll just see less of a performance drop when moving to higher resolutions.
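
To put the same idea in pseudo-math: your real frame rate is roughly min(what the CPU can feed, what the GPU can draw). Here's a tiny Python sketch using the made-up numbers from the example above (the 90 FPS raw figure for the 8800 at 1280x1024 is my own assumption, since the CPU cap hides it anyway):

# Toy model: actual FPS is capped by whichever part is slower.
def fps(cpu_cap, gpu_raw):
    # The slower of the CPU's frame cap and the GPU's raw output wins.
    return min(cpu_cap, gpu_raw)

cpu_cap = 70  # frames/sec the CPU can prepare, roughly resolution-independent

# Raw GPU output at each resolution (hypothetical numbers from the example;
# the 8800's 90 @ 1280x1024 is assumed)
gpu_7900 = {"1280x1024": 50, "1600x1200": 25}
gpu_8800 = {"1280x1024": 90, "1600x1200": 68}

for res in ("1280x1024", "1600x1200"):
    print(res,
          "| 7900:", fps(cpu_cap, gpu_7900[res]), "FPS",
          "| 8800:", fps(cpu_cap, gpu_8800[res]), "FPS")

# 1280x1024 | 7900: 50 FPS | 8800: 70 FPS  <- 8800 is CPU-capped
# 1600x1200 | 7900: 25 FPS | 8800: 68 FPS  <- 8800 barely drops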
 

kbhadauria

Distinguished
Nov 8, 2007
OK, cool guys. From what I gather, I am going to bite the bullet on the 8800GT. I think 35 FPS is fine for me since I will be using it at 1280x1024; going to see what BioShock is like. As long as I get above 30 FPS in games, anything is fine. Also, I will OC my CPU to 2.5-2.8 GHz to balance out the bottleneck.
 

KyleSTL

Distinguished
Aug 17, 2007
A bottlenecked video card simply means you're getting the absolute most out of your processor. At the current price for an 8800GT, I'd say it's well worth it.
 

bliq

Distinguished



So what if it bottlenecks? Then you won't need a new card when you upgrade the proc, and you'll get a nice bump in graphics performance now. At the price of an 8800GT, it's worth it.

The alternative is you get a new proc and a new mobo, and graphics performance stays the same. You'll be disappointed.
 

speedbird

Distinguished
Apr 19, 2007


I'm using a socket 939 4400+ X2 at 2.2 GHz and was previously running a 7800GT card, but with my new 2900 Pro I have seen huge gains in performance.
 

rgeist554

Distinguished
Oct 15, 2007
I'm using a socket 939 4400+ X2 at 2.2 GHz and was previously running a 7800GT card, but with my new 2900 Pro I have seen huge gains in performance.
Going from single-core to true dual-core (DC) makes a huge difference. You both have X2, i.e. DC, processors, so you wouldn't see as much bottlenecking. If you use an older DC processor (a Pentium D, for example; I think there are Celeron examples as well) that isn't a true DC processor, the gains you see are minimal.

I'll give you my example: I used a Pentium D @ 2.66 GHz and upgraded to the Athlon 64 X2 5000+ @ 2.66 GHz. Same clock speed... and the Pentium D claims to be DC, but upgrading to the Athlon is when I saw my frame rates double and triple.
 

KyleSTL

Distinguished
Aug 17, 2007


You clearly do not understand what you're talking about.
Celeron has NEVER been dual-core (but it will be in 2008 with the E1000 series). The Netburst vs. Athlon 64 architectural differences are what you're seeing (not this "true dual-core" you're talking about). Yes, Pentium D processors weren't great, but their shortcomings were not caused by the two-processors-in-one-package design. They were caused by the horribly long pipeline of the Netburst architecture (among other things). Look up instructions per clock and you'll understand why the Pentium Ds were lacking compared to the X2s.
 

rgeist554

Distinguished
Oct 15, 2007
1,879
0
19,790
You clearly do not understand what you're talking about.
Celeron has NEVER been dual-core (but it will be in 2008 with the E1000 series). The Netburst vs. Athlon 64 architectural differences are what you're seeing (not this "true dual-core" you're talking about). Yes, Pentium D processors weren't great, but their shortcomings were not caused by the two-processors-in-one-package design. They were caused by the horribly long pipeline of the Netburst architecture (among other things). Look up instructions per clock and you'll understand why the Pentium Ds were lacking compared to the X2s.
I said I think there was an example with a Celeron. I know they claimed to use Core Duo architecture in their chips. There was even a post where some guy got tricked into buying a Celeron chip when he asked for a C2D, because the box claimed to be using Core Duo architecture or some crap. (I believe I'm referring to the Celeron 520; I'll try to find a link for it.)

The Celeron M is not your everyday Celeron processor, given that it derives from the great Pentium M. The 65nm Celeron M is based on the Core Duo, and the datasheets provided by Intel indicate that the Celeron M silicon has exactly the same dimensions as the Pentium M/Core Duo.

The distinction between the Celeron M and the Pentium M/Core Duo is that the cache and core are reduced by one-half. But users shouldn't worry too much about the smaller cache, because the Pentium M architecture has only 10-12 pipeline stages (unconfirmed), making it less dependent on a large cache (as with the earlier P2 and P3, where the Celeron kicked ass).

Source: http://www.dslreports.com/faq/11410

Yes, I know about instructions per clock. In fact, I even read a post you recently made that had a chart comparing relative performance per clock:
Pentium 1        1.1
Pentium MMX      1.2
Pentium 3        1.9
Pentium 4 (Wil)  1.5
Pentium 4 (Nor)  1.6
Pentium 4 (Pre)  1.8
Pentium 4 (Gal)  1.9
Pentium D        2.0
Pentium M        2.5
Core 2           3.0
K6-II            1.1
K6-III           1.3
Athlon B         1.9
Athlon C         1.9
Athlon XP        2.0
Athlon 64        2.3
Athlon 64 X2     2.5
Via C3           0.85
Via C7           0.9

Yes, I also know that the Pentium D was a true dual-core; it just sucked. I just worded it wrong, I guess, as I was trying to set up a comparison. No, I'm not an expert on the subject, and yes, I will admit I do not understand 100% of everything about CPUs. It seems I know enough to make a valid comparison, though.
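
For what it's worth, multiplying clock speed by those relative per-clock numbers gives a back-of-the-envelope feel for the Pentium D to Athlon X2 upgrade mentioned above (a rough sketch only; it ignores cache, memory, and everything else):

# Back-of-the-envelope: effective speed ~ clock (GHz) x relative work-per-clock
# (the per-clock factors are the relative numbers from the chart above).
chips = {
    "Pentium D @ 2.66 GHz":    (2.66, 2.0),
    "Athlon 64 X2 @ 2.66 GHz": (2.66, 2.5),
    "Core 2 @ 2.66 GHz":       (2.66, 3.0),
}

for name, (ghz, per_clock) in chips.items():
    print(f"{name}: ~{ghz * per_clock:.2f} relative units/sec")

# Same 2.66 GHz clock, but the Athlon X2 does ~25% more work per clock than
# the Pentium D, and a Core 2 ~50% more -- which is why clock speed alone lies.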
 

KyleSTL

Distinguished
Aug 17, 2007
You are correct; I remember that post where a stupid salesman told a guy the Celeron 420 was a dual-core chip (which it is not). Where it can get confusing is that "Celeron D" sounds like "Pentium D", which would make one think it's a dual-core part (it's not). The Conroe-L-based Celerons (420, 430, and 440) use the Core architecture. The Pentium M also has a cheaper variant called the Celeron M (both descended from Pentium 3 architecture [I know it's not a P3, don't flame me]), and both are single-core. The Core Duo you refer to is indeed based on the Pentium M architecture (and is dual-core). So as you can see, the naming schemes are impossibly confusing (and even harder to keep up with).

Sorry to jump on your case; I just wanted correct information given out. And apologies to kbhadauria for the thread hijack.

Edit: Core-based Celeron Ms were also made under the names 520, 523, 530, 540, & 550.
 

Hatman

Distinguished
Aug 8, 2004
Pentium Es, I think, had 2 threads; Pentium Ds actually had 2 cores.

ATM no single-core CPU can lift the graphics bottleneck on the high-end cards, nor can low-end dual-cores like AMD's 4200+ X2.

You will most likely be bottlenecked with a 939 X2 if you use an 8800GT, but that doesn't mean its performance will magically hit a wall; it'll still rock.

 
GPU vs. CPU is a little complicated.
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2747&p=4

For the MOST part, the GPU will increase your speed more than upgrading the CPU will. In certain games, however, the CPU shows a very good performance increase, like MS Flight Simulator, while others show little or no improvement with a newer CPU (most games are NOT CPU-limited).
Wish I could find the article that showed it perfectly, but I can no longer remember where I saw it.
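
One practical way to see which side limits a given game (a sketch of the usual rule of thumb, with made-up numbers): benchmark it at a low and a high resolution and compare. If the frame rate barely moves, the CPU is the wall and a faster GPU won't buy you much.

# Rule of thumb: if FPS barely drops when resolution rises, the game is
# CPU-limited; if it falls a lot, it's GPU-limited. Numbers are made up.
def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-limited" if drop < tolerance else "GPU-limited"

print(likely_bottleneck(62, 60))  # ~3% drop  -> CPU-limited (a flight sim, say)
print(likely_bottleneck(90, 55))  # ~39% drop -> GPU-limited (a typical shooter)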
 

KyleSTL

Distinguished
Aug 17, 2007
I think we all agree: since he has a dual-core Athlon, his system is fine and deserving of an upgrade, namely the 8800GT (which the OP already stated is the plan of action).

OK, I guess I am going 8800GT if I can find it in stock. Thanks for the answers.

 

kbhadauria

Distinguished
Nov 8, 2007
Thanks for all your help, guys. For those in Canada, I found an 8800GT and got it at 259.xx:
http://www.futureshop.ca/catalog/proddetail.asp?logon=&langid=EN&sku_id=0665000FS10095590&catid=
The price has gone up since, but I think if you go to the store they will match last night's cost.

Thanks again.