GT6800 or X800 with Intel PE845 MB

January 21, 2005 10:40:21 PM

Here is a good question, and I can be swayed either way depending on the supporting answers.

I have an Asus P4PE (Intel PE845 chipset) with 4X AGP and a 3.06GHz CPU using 512Meg of RAM. I intend to upgrade to the latest, greatest (which one?), and last(?) AGP video card.

The contenders seem to be:
Nvidia's GT6800 256MEG ~$377.00 (Leadtek)
ATI's X800 256MEG ~$382.00 (Sapphire), NO "SE" part, please.

Now the hard part. I really can't see the "ultra" versions of either board with an 865 chipset. And, the ultra price point seems ridiculous since the FPS just aren't big enough to really make a difference. Is 40FPS really better than 50FPS at a given resolution? If you're at 30FPS, then going to 40FPS may be an advantage if you need to run at that resolution, but the reviews of the two listed cards are all plenty fast enough at insane resolutions. And, to get BIG boosts in performance, scaling down to 1280 x 1024 doesn't seem like a hardship later on. That said, I am using a Sapphire 9800pro 128MEG card right now. It's OK, but HALO is kinda rough at 1024 x 768. No bias on manufacturer; I've used Nvidia and ATI, both.

Is the Nvidia 3.0 (edited from 2.0, sorry) shader technology really better?
Is ATI's legendary game (added "game") image quality still "legendary"?
What card will be the most flexible game wise as time moves on and I'm stuck with the last great AGP cards?
Is there another current AGP architecture (added "current") video card (added "video card") revolution just around the corner?
Vendor preference on either card other than Sapphire or Leadtek?
Sneaky mods that really work on either card?

Since these seem to be the last AGP cards out there (any arguments on this?), which card is the best card to be stuck with? And, free full version game packages do matter if they are any good.

rower30@earthlink.net



Edited by rower30 on 01/21/05 09:15 PM.
January 21, 2005 11:32:57 PM

Quote:
Is 40FPS really better than 50FPS at a given resolution?

I don't see how that could ever be the case, unless you get visual tearing because your monitor has a max refresh of 40Hz.

Higher refreshes are more important for multiplayer games, where seeing every change fast enough makes a difference. If you're running while scanning the horizon trying to paint/tag a player as they do the same, the fluidity of their pan may mean they can accurately target you first. That's the biggest advantage; other than that, it'll likely depend more on what you're comfortable with in the linear gaming of single-player, which has less frenetic jittering and usually less need for upper limits.

Quote:
Is the Nvidia 2.0 shader technology really better?

Definitely not, FX sux!

Quote:
Is ATI's legendary image quality still "legendary"?

When did they have legendary image quality? They were only better than nVidia; that's no longer the case, they are now equal under equal conditions. In PS2.0 with AA ATi has an advantage, but it's minor, and with AF they now use the same technique for full trilinear (both also have optional optimizations).
There are still only two Legends for image quality IMO: Matrox for 2D, 3Dlabs for 3D workstations. For gaming, ATI and nV are pretty much equal now.
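Since "full trilinear" versus the optimizations keeps coming up, here's a rough C++ sketch of the math involved. This is illustrative only and typed from memory; the bilinearSample stub is a made-up placeholder, not anyone's driver code:

struct Color { float r, g, b, a; };

// Placeholder bilinear fetch from one mip level; a real sampler would
// read 4 texels and weight them. Stubbed so the sketch is self-contained.
Color bilinearSample(int mipLevel, float u, float v)
{
    (void)u; (void)v;
    float shade = 1.0f / float(1 + mipLevel); // fake: coarser mips darker
    return Color{shade, shade, shade, 1.0f};
}

Color lerp(const Color& a, const Color& b, float t)
{
    return Color{a.r + t * (b.r - a.r), a.g + t * (b.g - a.g),
                 a.b + t * (b.b - a.b), a.a + t * (b.a - a.a)};
}

// Full trilinear always blends bilinear samples from the two nearest mip
// levels. The "brilinear"-style optimizations snap the blend weight to
// 0 or 1 except near mip transitions, to save texture bandwidth.
Color trilinearSample(float u, float v, float lod)
{
    int   mip0 = static_cast<int>(lod); // nearest finer mip level
    float frac = lod - float(mip0);     // how far toward the next level
    return lerp(bilinearSample(mip0, u, v),
                bilinearSample(mip0 + 1, u, v),
                frac);
}

The point being, both vendors now do the full blend when you ask for it, with the cheaper snap available as an optional optimization.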

Quote:
What card will be the most flexible game wise as time moves on and I'm stuck with the last great AGP cards?

Well, you're not talking about the last great AGP cards; these are below those, so the future is different. The GF6800GT would have a better 1+ year future; the GF6800vanilla and X800plain-jane have about the same long-term future, except if you want to run demos, because neither card would have the power for very advanced features when they come in games based on new engines.

Quote:
Is there another AGP revolution just around the corner?

Definitely not, it'll just die a slower death than we originally thought due to all the delays/problems in the launch of PCIe motherboards (not related to the PCIe part of the MoBo of course!! :mad: ). ATI's bringing the X850AGP line out a little later. Not sure what refresh is in store for nV. The future of AGP beyond that will likely be restricted to bridged solutions only.

Quote:
Since these seem to be the last AGP cards out there (any arguments on this?)

Yeah, *See Above.*

As for GF6800 mods, this has been discussed here many times, especially recently. Look for it and you will find it.

AND Google works, which will also answer specifics about game packages, as will the Buyer's guide and the links to card reviews therein.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
January 21, 2005 11:56:50 PM

I'm somewhat confused (easy to do) by your answers. The GT6800 and X800 use the same basic chipsets as their respective namesakes, the "overclocked" ultra versions. So, these seem to be the last of the AGP parts from a family perspective. These cards about double the frame rates over my 9800pro, putting them in upgrade territory.

I believe you thought I suggested that the AGP slot itself was due for an upgrade. Sorry for that; I'm stuck with the AGP slot I have, so I meant: are any 4X to 8X cards coming that are meant for the existing AGP slot?

I'm not so sure why 2.0 shader technology sucks, unless it's because it is too new to use right now. That's true. When it is used, why does it suck, though?

The rage for the last several years has been ATI's image quality over Nvidia on GAMES. I'm not so sure there is ANY argument as to workstation or 2D. Games are games, and ATI did have a lead there. It seems to me, as you point out, others feel the difference in the cards is diminished to driver quality and not hardware limitations.

I'm still not sure where the "technical" advantages of either card are, however. What I need is ATI and Nvidia fanboys to whip me to death. Yes, I've read the reviews, but users have interesting and different viewpoints that manufacturers don't want inquiring minds to know. The two cards are probably "the same" from a layman's point of view. What about the perfectionist's?
January 22, 2005 3:48:03 AM

Quote:
The GT6800 and X800 use the same basic chipsets as their respective namesakes, the "overclocked" ultra versions.

Yeah, except they are crippled; and just like an R9800SE uses the same basic core as an R9800PRO, it doesn't mean it's a very worthwhile upgrade from a GF4ti without the modding Wusy does.
A plain GF6800 and X800 do NOT give you double the framerate of an R9800Pro unless it's a 128bit pro. And that's the point: the GF6800GT/Ultra and X800XT/XTPE are in the range of that 2X leap from the R9800Pro you speak of, and as such have all those benefits.

Quote:
I believe you thought I suggested that the AGP slot itself was due for an upgrade.

Nope. I think you're reading into what I wrote there.

Quote:
I'm not so sure why 2.0 shader technology sucks, unless it's because it is too new to use right now.

It's more about the concept of Shader 2.0 under nV being their FX parts and, well, that sucks. Their Shader 3.0 hardware is very interesting and handles 2.0, similar to the Radeon cards, very well.

Quote:
It seems to me, as you point out, others feel the difference in the cards is diminished to driver quality and not hardware limitations.

Now both ATI and nV are equal on all fronts, both have driver limitations/issues and IQ parity.

Take a look at the Digit-Life December Digest I posted earlier today (Friday) to see that.

Quote:
different viewpoints that manufacturers don't want inquiring minds to know.

Well here you go.

SM3.0 on the GF6800vanilla is pointless other than as a slideshow demo of the effects, so for games both cards will be about equally poor at SM3.0 support (heck, even the GF6800GT and Ultra get clobbered with HDR). The X800 (never saw an AGP version reviewed as such) isn't going to take you far beyond 2006 either, other than at lower resolutions and features. Neither card is a worthwhile upgrade from an R9800Pro if you pay full price and don't get good money for your old card.

You'd be far better off buying a GF6800GT, or something equivalent to an X800XT(non-pe). At least the GF6800GT would be more adept at SM3.0 features, all would do PS2.0 superbly.

As for the perfectionist, the perfectionist probably now has the attitude towards these cards that I had towards the early feedback of the GF6800Ultra and X800XTPE. Yeah, nice, but not worth the money for the performance difference!

The thing will be to find the games YOU like, find out which of the two plays them fastest, and maybe which interface / driver features (frequency of updates) you prefer.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
January 22, 2005 1:12:25 PM

So far so good.

I was led to believe that Nvidia's "new" rostering method for 2.0 is much improved. Seems to be the case based on performance.

The unused 3.0 spec was (is?) supposed to improve efficiency over 2.0. You seem to say, "not so" on this?

As far as what games I like, it still doesn't sway the radically different game engines used to program them. FarCry is a FarCry harder to run than TRON 2.0, but both are first person shooters, and I liked them both. The VPU has to deal with a randomly variant set of game engines.

The average FPS are indeed about double what my 9800pro (non SE) part runs at 1024 x 768 on most games of reference. More important, the min FPS are held above 30FPS.
The capability gets markedly worse as the resolutions increase.

From what I see, the GT6800 doesn't seem to be "crippled" any more than a 3.06GHz CPU is relative to a 3.2GHz CPU. Same chip, different outcome on the mask. Pipelines are the same. Memory is vendor-to-vendor variant. BFG seems to be the best bet for good memory selection.

My 9800pro is a good card, but the long term capability of this card is wanting. I can't get above 1024 x 768 on most games with any degree of playability. The XT800 or GT6800 allow MUCH higher initial resolutions and can at least play next gen games at 1024 x 768, or even 800 x 600 for games made after that, maybe.

I agree that too many significant PC bus upgrades may make ANY video card terrible, with PCIe x16 coming on board. AGP just can't keep up with that. Who knows, maybe memory will also make a big leap ahead.
January 22, 2005 2:51:23 PM

Quote:
I was led to believe that Nvidia's "new" rostering method for 2.0 is much improved. Seems to be the case based on performance.

This is only vis-a-vis the FX series in my opinion, which basically has allowed it to equal the ATI parts, as they should. There are no significantly improved efficiencies; the only major benefit so far has been geometric instancing, and it's possible to do it similarly with the ATI part using 2.0b paths (see FartCry as an example).
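For anyone wondering what geometric instancing actually buys you: instead of one draw call per tree/crate/soldier, you issue one call for all of them. Here's roughly how it's exposed in Direct3D 9, just a sketch from memory; the MeshVertex/InstanceData layouts and DrawInstanced wrapper are made-up placeholders, not from any shipping game:

#include <d3d9.h>

struct MeshVertex   { float x, y, z; };   // assumed per-vertex layout
struct InstanceData { float world[16]; }; // assumed per-instance transform

void DrawInstanced(IDirect3DDevice9* dev,
                   IDirect3DVertexBuffer9* meshVB, // shared geometry
                   IDirect3DVertexBuffer9* instVB, // one entry per instance
                   IDirect3DIndexBuffer9* ib,
                   UINT vertexCount, UINT triCount, UINT numInstances)
{
    // Stream 0 holds the mesh, replayed once per instance.
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    dev->SetStreamSource(0, meshVB, 0, sizeof(MeshVertex));

    // Stream 1 steps forward once per instance instead of once per vertex.
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    dev->SetStreamSource(1, instVB, 0, sizeof(InstanceData));

    dev->SetIndices(ib);

    // One call draws numInstances copies of the mesh.
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, vertexCount, 0, triCount);

    // Restore normal (non-instanced) stream behaviour.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}

The win is fewer draw calls (less CPU overhead), which is why it shows up in things like FartCry foliage; as I understand it, the ATI 2.0b path gets a similar effect through a driver-exposed workaround rather than the SM3.0 requirement.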

Speaking of FartCry, it's a perfect example of what I'm talking about. Much was made of the features that would benefit the nV40 series over the ATIs, yet the ATIs are generally faster, and the only IQ differences are with HDR on, which brings even the top card's performance down so much (and you lose AA) that you have to dumb down your resolution settings to make it playable. HDR is possible with ATi cards, however Crytek decided to use a different method than rthdribl, so you won't be able to use it in FartCry; however, with a GF6800plain you won't realistically be able to use it either.

For that game both cards will be very similar.

Quote:
The average FPS are indeed about double what my 9800pro (non SE) part runs at 1024 x 768 on most games of reference.

Can you provide some support for that? Because all the evidence I have seen doesn't support that assertion. I think if anything you're confusing cards here, because the GF6800vanilla just isn't going to pull that off except as a rare anomaly.

Look at <A HREF="http://www.digit-life.com/articles2/digest3d/index0412...." target="_new">this digest</A> and show me these figures. And as you crank up the features like AA/AF you will see the gap diminish. The other thing to consider is that it has been shown that nV's 128mb cards aren't going to the highest levels of AA/AF at higher resolutions; the card/drivers automatically relax the quality settings to increase performance, so make sure you're not comparing that.

If you are only comparing min. FPS, maybe that'll happen, but it'd be something brief and small, like 2fps versus 4fps.

And you may find a sweet-spot resolution where there will be a near-100% performance gain, but it's not global by any means.

Quote:
The XT800 or GT6800 allow MUCH higher initial resolutions and can at least play next gen games at 1024 x 768, or even 800 x 600 for games made after that, maybe.

800x600, yeah, maybe/probably; however, no one really knows what it will require. The UnrealEngine 3.0 was demoed on the NV40 running at full Ultra speeds, and even with minimal character and background rendering it was at lower resolution and chugging along. I wouldn't say it's guaranteed that a GF6800vanilla or plain X800 would handle these at anything above 640x480 with full effects; however, it may run well with reduced effects at even 1024x768 or 12x10+. We just don't know yet.

Quote:
Who knows, maybe memory will also make a big leap ahead.

Not while the whole card is being held back by the connector and, to a much lesser extent, the bridge chip. Memory can help somewhat, but data still needs to be loaded to and grabbed from the card before it can be loaded/manipulated by the memory onboard. And there's no TurboCache solution for AGP from what I've seen. Seriously, AGP's time has come; it's still possible to use, like ISA is for so many applications, but really the companies should be moving forward, not supporting legacy stuff (much to the dislike of many people here). Buying a new PCIe board isn't that expensive when we are talking about $300+ graphics cards. And while the initial introduction of Intel boards was ridiculously prohibitive, that's no longer the case. AGP cards are a good transition, but by the end of the year, barring more MoBo delays, AGP will likely lose a lot of its support from the hardware mfrs. But really it has less impact on your card, because going from the vanillas to the GT/Ultra/XTs would offer yet another noticeable boost.
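To put rough numbers on the connector bottleneck, here's a back-of-envelope comparison. These are the usual theoretical peak figures only; real-world throughput is lower on both buses, and this ignores the bridge chip entirely:

#include <cstdio>

int main()
{
    const double agp8xGBps  = 2.1; // AGP 8x peak, one direction at a time
    const double pcie16GBps = 4.0; // PCIe x16 peak, per direction, both ways at once

    const double textureMB = 64.0; // hypothetical texture upload
    const double mbPerGB   = 1024.0;

    std::printf("64 MB upload over AGP 8x  : %.1f ms\n",
                textureMB / (agp8xGBps * mbPerGB) * 1000.0);  // ~29.8 ms
    std::printf("64 MB upload over PCIe x16: %.1f ms\n",
                textureMB / (pcie16GBps * mbPerGB) * 1000.0); // ~15.6 ms
    return 0;
}

And unlike AGP, PCIe can read back at the same rate it uploads, which is the part TurboCache-style schemes lean on.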

Quote:
From what I see, the GT6800 doesn't seem to be "crippled" any more than a 3.06GHz CPU is relative to a 3.2GHz CPU. Same chip, different outcome on the mask. Pipelines are the same.

Then you aren't actually talking about a GF6800; you are looking at either the GF6800GT/Ultra or a MODDED GF6800. A plain GF6800 only has 12 active pipelines and slower clocks; the fact that one quad is masked is what makes it a GF6800, and no, not all will mod successfully.

If you're arguing about a modded GF6800 and a modded X800Pro, then really you need to learn to be more clear in your statements/questions. A stock Ford Escort will not beat a Corvette in performance tests; however, a 'stock' Cosworth Escort will easily beat a factory 'stock' Corvette, but not a Callaway Vette or a C5-R w/ nitrous.
Clarity helps a lot, and as long as you're talking GF6800/X800 then what I said stands; if you're essentially talking about GF6800GT/X800XT then that's a different story, and there's a lot of information about that already posted here, and it will help you figure out what matches your needs.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
January 22, 2005 3:20:38 PM

Quote:
The average FPS are indeed about double what my 9800pro (non SE) part runs at 1024 x 768 on most games of reference.

I'd have to disagree also. Do you have a 128-bit 9800pro? I think you are comparing your fps on a P4 3.06GHz / 9800pro system with online results from a top-notch system. You can't do that for a comparison. Look at 1024x768 for Farcry in Grape's December Digest link above, and you'll see nothing, including an X800XTpe or 2 GF6800U's in SLI, doubles the framerates of a real 256-bit 9800pro at plain ol' 1024x768.


The X800XTpe and GF6800U would rarely double the framerates at 1024x768. In most cases that is a CPU/system-limited resolution. Which means even on an Athlon 64 4000+ system, the cpu is still holding the video card back at times, never mind a P4 3.06 system. Put the stress on the video card by turning on AA/AF and cranking the resolution, and the top cards will surely double the framerates of a 9800 pro. But at those top settings a GF6800 is also going to be unplayable and be demolished by the top 16-pipe cards.

Basically, what I am trying to save you from is thinking that upgrading your system from a 9800 pro to a GF6800 (or even GF6800GT) is going to double your framerates at your desired 1024x768 resolution. It simply isn't going to happen. What a GF6800GT will do is allow you to play at 1280x1024 4x/8x, when a 9800 pro only manages 1024x768.
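A toy way to picture that CPU limit (all the fps numbers below are invented for illustration, not benchmarks): the delivered framerate is roughly capped by whichever of the CPU or the GPU takes longer per frame.

#include <algorithm>
#include <cstdio>

// Whichever stage is slower sets the pace.
double deliveredFps(double cpuFps, double gpuFps)
{
    return std::min(cpuFps, gpuFps);
}

int main()
{
    const double cpuFps    = 70.0;  // what a P4 3.06 might feed (made up)
    const double gpu9800   = 65.0;  // 9800 pro at 1024x768 (made up)
    const double gpu6800gt = 160.0; // 6800GT at 1024x768 (made up)

    // At a CPU-limited resolution the faster card barely shows:
    std::printf("9800 pro: %.0f fps\n", deliveredFps(cpuFps, gpu9800));   // 65
    std::printf("6800GT  : %.0f fps\n", deliveredFps(cpuFps, gpu6800gt)); // 70, not 160
    return 0;
}

Crank AA/AF or the resolution and the GPU number drops below the CPU number, and suddenly the card choice dominates.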


<A HREF="http://service.futuremark.com/compare?2k3=3400555" target="_new"> My</A>
<A HREF="http://service.futuremark.com/compare?2k1=8268935" target="_new">Gamer</A>
January 23, 2005 3:12:32 AM

OK, you guys seem to know your stuff on cards.

I do mean a BFG or Leadtek 6800GT 256MEG at 350/1000 (370 core on the BFG). And no, not the "hot rod" cards either.

If you look at harder games like FarCry, the acid test for future demands, I see this at your site of reference, which is also what I see elsewhere:
26.9FPS on a 9800pro 128MEG card with 380/680MHz clocks (I have a Sapphire) and 52.1FPS on a 6800GT 256MEG with 350/1000MHz clocks @ 1024 x 768 and AA4x / ANIS-16x (8X), using an AMD3200+ CPU. The system is about what I have now, and the FPS are about double. See:
December 2004 3Digest: FarCry, Research level, 1024x768, AntiAliasing+Anisotropy, Windows XP
http://www.digit-life.com/articles2/digest3d/1204/itogi... (I'm having a devil of a time getting this link to be click-able. I can't find the "paste as link" command. How do you guys paste as link?)

If I disable AA and ANIS, the frame rate is only about 25% faster with the newer cards. But I don't / won't use the card this way till I HAVE to. And if I go above 1024 x 768, the difference becomes much larger. So, your site supports my statement that the newer cards are about twice as fast when used as intended, and not with all the features turned off. What am I not seeing here?

I thought that the ATI cards had dumbed-down pipelines (SE) and memory, just like the Nvidia cards? The X800pro has 12 rendering pipelines and the 6800GT has 16, correct? It is amazing what ATI does with 12 versus 16 pipelines, I might add. The card is darn efficient.

So far, it looks like a Nvidia BFG brand 6800GT 256MEG card seems to be the best so far.
January 23, 2005 10:38:34 AM

Most important here: you are talking about a GT, NOT a plain GF6800, so your statement about the plain GF6800 being twice as fast is not true. Like I said, only the GT and Ultra are that close, and even then it's not global.

Anywhoo, with the choice being a GF6800GT, unless the other card is an X800XT (you refer to the PRO), you'd be better off going with the GF6800GT.

For posting links, use the following method, replacing <> with []:

<url>www.google.com</url> = <A HREF="http://www.google.com" target="_new">http://www.google.com</A>

(If you don't include http it will add it; if you do, it won't. But it will also add http to anything else, like an s-http link, which may not work because it's added the http to it.)


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
January 23, 2005 11:58:51 AM

Yup, while plain ole <A HREF="http://www.ixbt.com/video2/over2k4-fcs20-1024.shtml" target="_new">1024x768</A> is NOT a resolution to take advantage of the new cards, turning on AA/AF and/or cranking the resolution is what the new top cards are all about. If you want to game at high resolutions, or 1024 and above with 4XAA/8XAF, then a GF6800GT is a worthwhile upgrade from that 9800 pro. I am considering a similar upgrade when the price is right. Until then I'll just sacrifice resolution/details/AA and OC this 9800 pro as much as is needed. It sure would be nice to be able to just set all games at 1280x1024 4X/8X high details and game away. But a plain 12-pipe low-clocked GF6800 is far from handling that task and is NOT a good upgrade from a 9800 pro, unless you just need a second card and get a good deal on the GF6800. It is a pretty decent 1024x768 AA/AF card for games like Farcry and D3, but bump up the resolution and the GT quickly leaves the plain 6800 far behind. If successfully unlocked to 16 pipes and overclocked some, the GF6800 can be a nice performer though, and even at stock it's a good upgrade for a GF4Ti owner or Radeon 9600 pro (or less) owner who can't afford the 6800GT.

But for you a GF6800GT would be nice, or a X800pro IF it is significantly cheaper, or a X800XT if it isn't much more. Basically in AGP models, without going with the top dollar ULTRA's or Platinum Editions: X800XT > GF6800GT > X800 Pro > GF6800.


<A HREF="http://service.futuremark.com/compare?2k3=3400555" target="_new"> My</A>
<A HREF="http://service.futuremark.com/compare?2k1=8268935" target="_new">Gamer</A>
Edited by Pauldh on 01/23/05 09:04 AM.
January 24, 2005 11:50:51 PM

Hello again,

I hoped to push aside the "what card" issues by posting the prices, name of the card, vendor name, and "no SE" at the beginning of this post. So, sorry if I left something out. What was it?

Regards,
Galen Gareis

PS - HalfLife2 runs at 1024 x 768 with AA and perspective correction "off" by the default system check with my 9800pro 128 MEG (380/680MHz). I suppose this is better than 800 x 600 with AA and perspective correction turned on? This game engine seems OK if things move slow, i.e. sneak games like Thief, System Shock, etc. But full-on action games? Possible, maybe. So far the game is pretty "dead" on the fast-paced movement side. Load times are awful, and the maps are about as big as a pin head. Will more RAM (512MEG to 1Gig) memory (not video) help increase the map size? Load times will be crap no matter what; "it's the hard drive, stupid".

Which game engine will make money?
Doom 3 engine - Too system dependent, slow.
FarCry - see above. Map size is OK, load times OK.
Half-Life 2 - Seems OK with FEW moving objects. But multiple fast action games? Terrible map sizes, terrible load times.
HALO (not II) engine - now we're talking. This engine is great looking, with plenty of multiple-character action at good frame rates. ZERO map load times and seamless play through and through. Inside, outside, it rocks.

The thing is, it seems, if a game engine can't be sold to build and run other games, and across MANY PC platforms from slow to fast, you could go bankrupt. And I don't mean run it with so many features turned off it looks terrible. Soldier of Fortune II looks good and runs on about any PC, for instance. Not too many people can buy games that need 500 or even 300 dollar video cards. I'm not sure where the sweet spot is in the market, but it sure ain't the guys and gals in this post.
January 25, 2005 4:51:33 AM

"No SE" refered to the X800 thus setting the tone, and there's the:
X800SE(8 pipelines)<X800(12)<X800PRO(12)<X800XT(16)<X800XTPE(16)

So saying "no SE" along with the X800 and GF6800 leads most readers to believe you are looking at the 12 pipline cards because of pricing. Anywhoo, so be it.

Loading times can be tweaked on HL2; you have to go to the Valve Forums for tips on that. Priyajeet had some tweaks listed in his thread earlier this year.

As for HALO, the PC version SUX! The XBOX versions 1 & 2 are great and much smoother than HALO on PC. And of course they have the benefit of co-operative play! The PC version doesn't have that great framerates, though. D3 really scales well, so really to me it's a much better game for that reason. HL2 is kinda disappointing for stability reasons IMO.

The game engines that will make money IMO are HL2 and D3. Crytek is tweaking their engine for another release, but I don't know of others licensing it (not that I've investigated it heavily). I think their biggest news was Machinima use of their engine.
The thing about HL2 and D3 is where you can take them beyond their own versions. And while HL2 itself is somewhat small, it's not limited for other games based on the engine; there are already a few games coming to market based on the engine, like Vampires, which also gives it an advantage over all the others listed. And considering that all 3 of the others have been out longer, that's impressive. Now whether or not we should be counting since the original release date in Sept of 2003 would be a valid question; however, what games are based on the Halo engine? So it's still ahead there. D3's major drawback is CPU requirements, but considering how poor the PC port of HALO has been, I'd say that those two are better. FartCry really is only limited by the lack of a 'name brand' to drive adoption by the people who I think were mostly waiting for HL2 and D3 to come out before deciding who to go with in making their game; also, I doubt Crytek has the same developer relations guys that Valve and Id have.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
January 25, 2005 5:46:26 PM

OT here Grape, but are you aware of any reviews that include MOHPA? Do you know offhand if that engine will be used in many other games, or just maybe an expansion to mohpa? Just curious as I have googled a little for reviews but only seem to find brief forum talks and not much good reading. I thought originally that this engine would be used in other games and that mohpa benchies would be added to many gpu reviews. Don't stress over searching for anything, just thought I'd ask if you possibly came across some performance numbers for cards with this game/engine since you frequent more review sites than I do.


<A HREF="http://service.futuremark.com/compare?2k3=3400555" target="_new"> My</A>
<A HREF="http://service.futuremark.com/compare?2k1=8268935" target="_new">Gamer</A>
January 25, 2005 7:12:34 PM

I know a little about this. Before it came out there was a lot of discussion of how it was a departure from the previous engine (Q3 IIRC?) and that they had built their own engine.

I remember seeing an initial review with benchies galore, but I don't remember where. I thought it was FiringSquad but I didn't find anything yet.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil: