video upgrade, cpu bottleneck

laelomo

Distinguished
Apr 19, 2002
23
0
18,510
i'm currently using a tnt2 16 meg card and am thinking about getting the gf4 4200 when it comes out. these are my specs:

p3 733
asus cusl2c-bp
512mb 133mhz crucial sdram
3com nic, sblive value, wd 20 gig 7200, pioneer dvd....

here're my questions:

1. will the cpu be a big bottleneck if i get the gf4? i don't want to waste money buying that if the cpu is gonna make it perform like a gf2.
2. can i use a 128 mb video card w/ my mobo? it uses the 815e chipset.
3. would it make a big difference if i get a 1ghz p3? will it eliminate the bottleneck or at least show a noticeable increase in performance?

i'm pretty happy w/ my setup the way it is except when it comes to gaming. i don't want to upgrade to a p4 or athlon right now cuz of the expense of replacing the ram and mobo along w/ it but i might do a p3 1ghz upgrade and stick the 733 into another machine i have that's running a celeron right now. but that's only a secondary system which i basically only use for burning. i don't want to jump from a 733 to a 1ghz if it only makes a small difference. anyone have any suggestions? thanks.
 

hammad

Distinguished
Apr 17, 2002
20
0
18,510
a p3 1 ghz would do the job. a 733mhz can't handle the geforce4 ti 4200; for 733mhz you'd have to choose the gf4 mx 440. i suggest you buy the .13 micron 1.4 ghz p3 or the p3 1a ghz. it would boost performance by 20 to 30% when playing return to castle wolfenstein at high resolution.
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
Well, the GF4 is currently very CPU intensive (this may be due to the early nature of the current drivers). The R8500 is less CPU dependent, so you can run it on a slower system without losing too much performance. However, knowing nVidia, they're sure to release faster, more optimized drivers sooner or later.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
 

laelomo

Distinguished
Apr 19, 2002
23
0
18,510
i don't think the cusl2c boards support the p3s w/ the new core (over 1ghz). am i mistaken? i'd love to get a 1.4 ghz cpu but unfortunately, they weren't released when i bought my mobo.

<P ID="edit"><FONT SIZE=-1><EM>Edited by laelomo on 04/19/02 08:33 AM.</EM></FONT></P>
 

wapaaga

Distinguished
Jan 18, 2001
1,070
0
19,280
gf4 mx is basically a gf2 that has been redone for better performance

you would be better off getting a ti200

what is better than a 7000 rpm fan? an 8000 rpm delta, to cause more noise and kill your ears :smile:
 

laelomo

Distinguished
Apr 19, 2002
23
0
18,510
a bit off topic:

i had an ati card once several years ago. i called em up for support and some guy picks up and just says, "hello?" after i confirmed that i called ati, i told the hoser i needed tech support, he said the tech was on the phone on another call (there was only 1 tech according to him) and jotted down my number and said he'd have the tech call me back. he called back later but couldn't help w/ whatever my problem was. they both sounded stoned. i haven't dealt w/ anything ati ever since and i always skip over the articles on ati. but i guess they must've gotten their act together if they're the second biggest player in the video card market.
 

laelomo

Distinguished
Apr 19, 2002
23
0
18,510
thanks for the advice everyone. i'm pretty sure nvidia will release better drivers. they seem to do this w/ every new release. i'm leaning towards the ti200. anyone care to speculate if the 4200 will perform better on my system than the ti200 once they release "optimised" drivers?
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
Well, the Ti4200 and the Ti200 will cost about the same, and when or if you upgrade your comp, you'll be able to use the extra power from the Ti4200. Basically it's your call, but I'd get the Ti4200, and as you said, Nvidia will almost definitely release new drivers that improve performance on slower systems.
 

DDR64

Distinguished
Apr 16, 2002
27
0
18,530
I think everyone here is overestimating the P3. For a P3 733MHz-1GHz, I would think a GF4 MX is okay, but getting a GF4 Ti4200 is a big waste of money unless he plans to upgrade his CPU+mobo in the near future.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I don't know for sure but this might help. Anandtech (www.anandtech.com) has a review of sub-$200 video cards: http://www.anandtech.com/video/showdoc.html?i=1608. In the review they give CPU scaling results. They use an Athlon XP clocked from 800 MHz to 1733 MHz. In some games (Unreal, Quake III, Serious Sam 2) the XP at 800 MHz w/ a GeForce4 Ti4200 performed better than an XP at 1733 MHz with almost all the other video cards. In these particular games it is clear the video card used was more critical than processor speed.

Another thing to notice with the scaling results is how the GeForce4 Ti4200 has high performance with the CPU at 800 MHz but the performance continues to rise as the clock speed is brought all the way up to 1733 MHz. The other cards tend to level off much sooner, sometimes as low as 1000 MHz. This is true except with the JK2 and Comanche 4 games (but the Ti4200 still leads). This shows the Ti4200 has more to offer when you upgrade the rest of your system later. The Ti4200 scales better.
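To make "scales better" concrete, here's a minimal sketch with made-up numbers (not figures from the Anandtech review) showing how you'd compare two cards' gains from a faster CPU:

```python
# Hypothetical average fps for two cards at two CPU clocks (invented numbers,
# only to illustrate the idea of CPU scaling headroom).
fps_at_800mhz = {"Ti4200": 95.0, "other card": 80.0}
fps_at_1733mhz = {"Ti4200": 150.0, "other card": 88.0}

for card in fps_at_800mhz:
    gain = fps_at_1733mhz[card] / fps_at_800mhz[card] - 1.0
    print(f"{card}: +{gain:.0%} going from 800 MHz to 1733 MHz")

# Ti4200: +58%  -> still climbing, so it leaves headroom for a future CPU upgrade
# other card: +10% -> already leveled off; a faster CPU buys little extra
```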

I'm sure the results depended on the game used, the fact that the XP uses DDR memory at 133mhz, and other factors but the scaling information may help you decide.

Just one note: suspiciously absent from the test was the Radeon 8500. (Is the retail price over $200?) The 8500LE was there. If I were to guess, the 8500 might take the lead in JK2 (because the results were so close) but would still likely trail in the other tests.

I have so many cookies I now have a FAT problem!
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
The Radeon 8500 was there, but only the 64MB version, because the 128MB version is significantly more expensive (over $200 US). JK2 and Comanche are extremely CPU limited, and only once you get to the extreme top resolutions can you start to see a separation in the scores/FPS. As you could see in JK2's CPU scaling, only the GF2MX and the R7500 topped out before 1733MHz. Everything else is still going strong. Based on all the other scores, I'd say the Ti4200 is able to beat the R8500 at just about everything except for AA performance maybe.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
"The Radeon 8500 was there, but only the 64MB version because the 128MB version is significantly more expensive"

I started a new thread to discuss this review and I noticed the Radeon 8500 was there. I guess I missed it earlier because the author didn't speak of it in the summary.

"As you could see in JK2's CPU scaling, only the GF2MX and the R7500 topped out before 1733MHz. Everything else is still going strong"

OK, so maybe the other cards didn't exactly "top out", but the decreased slopes in the curves still show the Ti4200 scales better (at least within the confines of the test).

"Based on all the other scores, I'd say the Ti4200 is able to beat the R8500 at just about everything except for AA performance maybe."

This brings up something I was thinking about. It was sort of mentioned that max quality settings were used for each test, but "max" was never defined (unless I missed that too). For instance, it was mentioned that the Extreme Quality add-on was used for Serious Sam 2. I believe this add-on activates anisotropy but not FSAA. Now, early GeForce cards only have 2-tap anisotropy, but Radeons went up to 64-tap (not that I know what these mean). I don't know what newer cards can do, but different settings for different cards would skew the results. Without clearly stating the option settings, the results are somewhat ambiguous.


I have so many cookies I now have a FAT problem!
 

laelomo

Distinguished
Apr 19, 2002
23
0
18,510
hey everyone, thanks for the advice. i think i'll be getting a ti200. don't wanna mess around w/ gf4 mx if it's just gf2 tech. thanks again
 

DDR64

Distinguished
Apr 16, 2002
27
0
18,530
you know, CPU scaling benchmarks can be somewhat misleading in that they report the average frame rate. In that anandtech article, the faster card indeed gives better fps with an 800MHz CPU, but the important thing is minimum fps. If a character in a 3D shooting game is at a point during a benchmark where the CPU is not the bottleneck (like going through a tunnel, facing a wall, or in a small room), then a faster card will yield more frames instantaneously. That's where the increase in avg. fps comes from, but at a moment where the CPU is the bottleneck (a large open area with lots of other characters), the instantaneous frame rate will be the same even with a faster card.

To put this into perspective, let's apply this to a real world situation. With an 800MHz CPU, if you are playing a multi-player online shooting game, more players in the game means more CPU load. Say you joined a game with the max number of players allowed for that particular game. If they all start to shoot in the open area, the CPU will crap out and you end up getting sub-20 or even 10 fps (just when it matters and where you'll be playing most of the time). In this case, a faster card won't yield more frames. Although you can play at higher resolution, color depth, and texture resolution with a faster card, you may end up getting no increase in frame rate. So for a particular game, you may be getting 50-60fps in benchmark mode, but in a real-time gaming situation, you may be playing at 10-20 fps most of the time in the same game. That's why i said benchmarks can be misleading.
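To illustrate the arithmetic, here's a minimal sketch with an invented frame-rate trace (these numbers aren't from any benchmark or the anandtech article): a short CPU-bound stretch barely dents the average but drags the minimum way down.

```python
# Hypothetical per-second fps samples: mostly GPU-limited scenes, plus a brief
# CPU-limited firefight in an open area (invented numbers, for illustration only).
samples = [72, 68, 75, 70, 15, 12, 14, 66, 71, 74]

average_fps = sum(samples) / len(samples)
minimum_fps = min(samples)

print(f"average: {average_fps:.1f} fps")  # 53.7 fps -- looks playable on paper
print(f"minimum: {minimum_fps} fps")      # 12 fps -- right when the action peaks
```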
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
I think they said they took out the valleys and peaks and then averaged the score, so yeah, you could be right. Then again, if he paid $100 for an R7500, when he got a new comp later on he'd be getting half or maybe less of the performance of a Ti4200, which he could have got for $50 more. So even though he may not be able to use it NOW, he'll be able to use it later and save some money by skipping the NV30 or R300 series and going to the NV40?, R400?....
 

DDR64

Distinguished
Apr 16, 2002
27
0
18,530
yes, you're right. For the sake of longevity of the investment, it's better to buy a faster DX8 card, given that a CPU+mobo upgrade is on the horizon. However, with a 733MHz P3, he will probably be disappointed by no substantial increase in frame rate if he expected that, and he has to suffer that till the next upgrade. If the CPU+mobo upgrade can't be done in the near future, the disappointment will last longer. That'll be just sad. Well, at least with a GF3 or GF4, he will be able to play at high resolution and texture detail that he couldn't before, if it's any consolation.

<P ID="edit"><FONT SIZE=-1><EM>Edited by DDR64 on 04/23/02 10:43 PM.</EM></FONT></P>
 

laelomo

Distinguished
Apr 19, 2002
23
0
18,510
i was considering the gf4 but i doubt i'll be doing a cpu upgrade in the near future. i'm guessing the release of the 4200 will drive down the prices of the ti200 and other cards. if the difference in price between the cards is negligible, i'll probably get the 4200 but if it's more than, say, $50, i'll probably get the ti200. i figure i'll save the extra loot and get a geforce 6 and a 7 ghz processor when they come out. aside from framerate, is there anything else i should consider? would i get better image or texture quality w/ a 4200? i read that the gf4's have improved versions of the features on the gf3 but i assume they translate into subtle improvements on screen and wouldn't be worth the extra money. thanks again for your input.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I agree with you completely. THG had an article almost two years ago describing the fallacies of framerate benchmarks and how minimum framerates were the real key to performance. A couple of articles later they were back to the same old routine, and minimum framerates were never mentioned again.

That said, one won't do well if one begins with a video card that performs poorly. If the average framerates are bad then the minimum framerates will be even worse.

I have so many cookies I now have a FAT problem!
 

Guest

Guest
I have a 750 MHz Athlon with a GeForce2 GTS 32MB card, an AMD Irongate chipset (AGP 2x, 100MHz memory, ATA 66), and 192 MB of memory. My experience is with the Evolva demo. The newer drivers do improve performance. With bumpmap mode turned on the picture is dark. The game will not hang with the newer drivers. Avg fps is 40-50 fps. It can get up to 130 fps.

My research of articles indicates that the more memory you have, the less you depend on AGP support. Apparently only now will the latest cards run new 3D games with new functions at adequate fps. Any standard game will run at great fps. Great color. My card will show adequate color and fps in normal, something called shrinkwrap, and 16 and 32 bit color. Something wrap anyway.

For what it's worth, I hope this helps. My preliminary opinion is that since some companies sell the card with minimum requirements, go ahead and go with that for color and fps; then you won't have to buy a card when you upgrade the platform/processor. Or opt for an earlier card for less.
 

Guest

Guest
One other thing. The articles show improved performance with more memory, and higher bandwidth memory, for advanced 3D features.
 

Guest

Guest
I have done a little more research. I ran the Evolva Rolling demo again. In bumpmap mode the avg fps is 25 fps. In the demo notes I found out that gamma needs to be corrected for light. This shows you that the card/system was not up to using all of its features as they should ideally be used.

The Anand review states 80% of buyers spend a max of $200 on graphics cards.

The GeForce MX line has all the tweaks the GeForce 2 should have had.

MadOnion.com will show you how other similarly configured systems run. I would imagine you could find out how well yours will run with various cards from others who have them.

As a final note, when I bought my system the card I have was just released, and Evolva was the only game with programming to use my card. The avg game user according to marketing had a 200MHz Pentium, thus games had to be compatible with that type of system. This would seem to indicate that anyone will get the max usability on the mainstream card, with all the tweaks one should have had on the cutting edge. Also, Nvidia's website will show all the 3D games with programming designed to use the card.

Oh, I bought my system in June of 2000.

Strangely enough, Gateway's Accessory Store's bestselling graphics card is Xtasy's GeForce2 MX 200 card for a little over $50. The Evolva game has been re-released as a White Label Evolva. The last message on their message board is dated Feb of '01.

This should give you an idea of what marketing has to deal with, and of real-world usage.