
Would running an X800XL in AGP4X cripple the performance?

Last response in Graphics & Displays
July 28, 2006 10:03:16 PM

I have just noticed that my motherboard (p4s800) supports AGP, but 1.5volts only. Does that mean my choice of video cards is limited? I have ordered a X800XL AGP and I can still cancel the order.
July 29, 2006 12:33:32 AM

Bump
July 29, 2006 1:09:27 AM

That's the voltage your AGP slot gets fed. What you need to find out is the speed of the AGP slot,

i.e. 2x / 4x / 8x.

1.5 V is a later revision, if I'm correct, so my initial thought is that you'll be fine. Double-check still...

Do you know what your motherboard is?
July 29, 2006 1:54:42 AM

My motherboard supports AGP up to 8X, but only 1.5V.

I have ordered a X800XL AGP that requires 0.8v for 8X, but can run at 4X under 1.5V. So basically I'll need to run it at 4X.

Would that really cripple the performance or is it still worth it to upgrade to that card?
July 29, 2006 1:56:49 AM

You won't notice a difference.
July 29, 2006 1:58:59 AM

I'm sure you could run it at 8x at a push...
July 29, 2006 2:08:12 AM

Quote:
I'm sure you could run it at 8x at a push...

Yeah, but you won't notice the difference anyway...
July 29, 2006 2:09:03 AM

Quote:
You won't notice a difference.
I will never be able to, because I'll never run it at 8X :wink:
But I'm glad you say so. Do you know of any comparative benchmarks in modern games? I know it doesn't make any difference in Quake III, but maybe it does in Oblivion, because of the higher requirements. :?:
July 29, 2006 2:14:43 AM

Theoretically, 8x AGP *should* have substantially more bandwidth available to use, making for a definite performance improvement... but in real-world terms, the difference between 4x and 8x is rather small. 4x will honestly give very similar performance to 8x.

edit: Come to think of it, I'd guess the reason is that most cards until recently never pushed AGP's limits anyhow, so 4x was still more than enough, let alone 8x. Now if you have a 7800GS or so, that card in itself will begin to push the borders of AGP (though I haven't personally read a 4x vs 8x comparison for it). So the X800XL might get a boost from 8x, but I don't know if the card is powerful enough to really benefit from it. With your card, it might be inconsequential either way; going 4x or 8x, you might still get identical performance, maybe.
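The "substantially more bandwidth" point is easy to put numbers on. Here is a minimal sketch (Python), using the published AGP spec figures (66 MHz base clock, 32-bit bus); real-world throughput is lower due to protocol overhead:

```python
# Peak theoretical AGP bandwidth per speed multiplier.
# Figures come from the AGP spec: 66 MHz base clock, 32-bit (4-byte) bus.

def agp_bandwidth_mb_s(multiplier):
    """Peak theoretical AGP bandwidth in MB/s for a given speed multiplier."""
    base_clock_hz = 66_666_667   # ~66 MHz AGP base clock
    bus_width_bytes = 4          # 32-bit bus
    return multiplier * base_clock_hz * bus_width_bytes / 1_000_000

for mult in (1, 2, 4, 8):
    print(f"AGP {mult}x: ~{agp_bandwidth_mb_s(mult):.0f} MB/s")
```

So on paper 8x (~2.1 GB/s) doubles 4x (~1.07 GB/s); whether a mid-2006 card ever fills even the 4x link is the real question.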
July 29, 2006 3:49:29 AM

The X800XL performs very similarly to a 7800GS; I think it is an even better card. But OK, I'm not cancelling my order. Thanks for the info!
July 29, 2006 3:58:30 AM

Well... they perform similarly in certain situations, mainly depending on what Shader Model version the game supports. If it's SM2.0 or lower, the 7800GS will pull ahead; if it's SM3.0, the 7800GS will about break even with the X800XL, or possibly even drop behind, because the X800XL supports only SM2.0, so there's less demand placed on it in those situations.

Glad I could help though :) , you should be very happy with your card's performance
July 29, 2006 12:41:44 PM

Quote:
you should be very happy with your card's performance
Especially compared with my current 9600SE - can't wait! :p 

What you said about SM2.0 and 3.0 would explain why the X800XL seems to perform so well in AOE3 and Serious Sam 2 (besting the X1800XL!). But does FEAR use SM3.0?
July 29, 2006 1:38:53 PM

Yeah, he would notice a difference. AGP 4x is so much slower than AGP 8x. I used a GeForce 6800 on an AGP 4x system that came with PCI Express (I was transitioning to PCI-E), and the card performed like crap compared to the AGP 8x I was used to.
July 29, 2006 2:00:56 PM

It more than likely wasn't a TRUE AGP port.

A lot of those combo boards bodge together a few PCI slots to create a pseudo-AGP slot (as AGP is basically an extension of PCI). The only chipset I know of that actually supports both PCI-E and AGP is the VIA PT880 Pro, an S775 chipset, which supports true AGP 8x; however, its PCI-E slot is only x4.

The difference in performance between AGP 4x and AGP 8x is about as significant as the difference between PCI-E x8 and PCI-E x16, i.e. negligible.

The only time this is not true is when you start trying to use high-detail textures that won't fit in the graphics RAM. With a 256MB card you should be OK.
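To gauge when textures stop fitting in a 256MB card, a rough footprint estimate helps. A minimal sketch (Python); the 4-bytes-per-texel and ~1/3 mipmap overhead figures are standard approximations for uncompressed RGBA textures:

```python
# Back-of-envelope texture memory footprint, to gauge when textures
# stop fitting in VRAM and start spilling over the AGP bus.

def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    """Approximate size of one texture; a full mip chain adds about 1/3."""
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmapped else base

one_2k = texture_bytes(2048, 2048)   # one 2048x2048 RGBA texture, ~21 MiB
print(f"One 2048x2048 RGBA texture: {one_2k / 2**20:.1f} MiB")
print(f"Fit in 256 MiB of VRAM: ~{(256 * 2**20) // one_2k} such textures")
```

Texture compression (DXT) shrinks these numbers by 4-8x, which is a big part of why a 256MB card rarely has to swap in 2006-era games.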
July 29, 2006 2:22:29 PM

Quote:
he would notice a difference. AGP 4x is so much slower than AGP 8x.

That's just not exactly the case... Video performance comes from the card itself. If the card's memory fills up, it will then 'swap' through AGP - but at that point, performance has already dropped.
Whether it swaps @ AGP 4x or 8x isn't crucially important - it's rather a factor in the degree of slowdown.
For example, a card with a 256-bit memory bus will be in a lot better shape @ AGP 4x than any 128-bit card will @ AGP 8x.
Regards
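The point about local memory mattering more than the AGP multiplier can be made concrete by comparing the card's own memory bandwidth with the AGP link it would swap over. A back-of-envelope sketch (Python); the ~490 MHz GDDR3 clock on a 256-bit bus is an assumption based on the reference X800XL spec:

```python
# Compare a card's local memory bandwidth against the AGP link.
# Once textures spill out of VRAM, every fetch drops from the local
# figure down to the AGP figure - the multiplier only softens the fall.

def mem_bandwidth_gb_s(bus_bits, clock_mhz, ddr=True):
    """Peak memory bandwidth in GB/s for a given bus width and clock."""
    transfers_per_s = clock_mhz * 1e6 * (2 if ddr else 1)  # DDR doubles rate
    return bus_bits / 8 * transfers_per_s / 1e9

local = mem_bandwidth_gb_s(256, 490)   # assumed reference X800XL GDDR3
agp4x = 1.07                           # AGP 4x peak, GB/s (spec figure)
print(f"Local VRAM: ~{local:.1f} GB/s, AGP 4x: ~{agp4x} GB/s")
print(f"Swapping over AGP is roughly {local / agp4x:.0f}x slower")
```

Whether the swap runs at ~1 GB/s (4x) or ~2.1 GB/s (8x), it is an order of magnitude below local VRAM either way, which is the poster's point.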
July 29, 2006 2:24:41 PM

Quote:
Yeah, he would notice a difference. AGP 4x is so much slower than AGP 8x. I used a GeForce 6800 on an AGP 4x system that came with PCI Express (I was transitioning to PCI-E), and the card performed like crap compared to the AGP 8x I was used to.

I'd wager Darkstar hit that nail on the head. If you had a true native AGP 8X / PCI-e 16X mobo like the Asrock 939 Dual SATA II, then you'd see no difference in performance between a 6800U AGP and PCI-e. Matter of fact, with fast writes, the slight win may go to the AGP version.

Anyway, as some have said, there is little difference between AGP 4X and 8X with the currently available cards & games. I tried my 6800U in both modes and the average framerates in benchmark runs were all within 1 fps of each other.
July 29, 2006 3:10:01 PM

Any of the new AGP cards barely saturates even a 2x AGP bus, making 4x and 8x overkill.
July 29, 2006 6:33:54 PM

Quote:
But does FEAR use SM3.0?

As for FEAR supporting SM3.0 or not, my guess would be yes, it does... because when looking at the performance between cards running FEAR, they all seem to be in line for direct comparisons between SM3.0- and SM2.0-capable GPUs.

edit: If FEAR doesn't support SM3.0, then I'm not sure how to really explain that chart, lol
July 30, 2006 6:45:03 PM

FEAR makes significant use of SM3.0, which is precisely why the 7800GS appears to be slower than the X800 in that game. The benchmark numbers typically give no indication of the extra work the 7800GS is doing.

As for AGP4x vs 8x, in most cases you'll see minimal difference.
July 30, 2006 7:09:14 PM

Quote:
But does FEAR use SM3.0?
F.E.A.R. and Serious Sam II both use Shader Model 3.0, so when comparing a Shader Model 3.0 card to a Shader Model 2.0 card, the older card doesn't have to do nearly as much work, making it run as fast or faster.
July 30, 2006 7:32:16 PM

Quote:
FEAR makes significant use of SM3.0, which is precisely why the 7800GS appears to be slower than the X800 in that game. The benchmark numbers typically give no indication of the extra work the 7800GS is doing.

And for all that extra work and slower performance, is the SM3.0 path rendering higher quality? What's missing from the SM2.0 path when running FSAA (no soft shadows)?

Here are screenshots - can you see any difference at all?
http://www.ixbt.com/video/itogi-video/pics/pics-fear.ht...
July 30, 2006 7:45:23 PM

Quote:
F.E.A.R. and Serious Sam II both use Shader Model 3.0, so when comparing a Shader Model 3.0 card to a Shader Model 2.0 card, the older card doesn't have to do nearly as much work, making it run as fast or faster.

Do you have a link about this? Digit-Life mentions running the SM3.0 and SM2.0 paths in their reviews, and the screenshots look identical. FiringSquad's and Anandtech's FEAR performance tests don't even mention SM3.0 and test the X8xx series right alongside SM3.0 cards:

http://www.firingsquad.com/hardware/fear_performance_ma...

http://www.anandtech.com/video/showdoc.aspx?i=2575


There are even doubts that FEAR uses SM3.0 at all, so I'd love to see some links that straighten all this out. I keep hearing people make comments like you two just did, but I've never seen anything to back it up, or even look into a difference in IQ, effects, etc. Anandtech even ran the soft-shadow tests on the X800s.
July 30, 2006 8:21:47 PM

SM3.0 and SM2.0 can be loosely equated to resolution and AA differences. For instance, say SM2.0 is equal to a resolution of 1280x1024 with 2xAA, and SM3.0 to 1600x1200 with 4xAA: running at the higher resolution and AA setting, your frames per second aren't going to be as high (unless, of course, your card can handle the higher setting without difficulty).

Virtually all benchmarks between cards on Tom's Hardware show lower performance running under SM3.0 than under SM2.0, in general.

You may not be able to see a difference that's completely noticeable, black and white, but there are certainly different stress loads placed on your card between the two SM versions.
July 30, 2006 8:22:57 PM

http://www.anandtech.com/printarticle.aspx?i=2686

Check out this review at Anandtech. The X850XT PE dominates the 7800GS in FEAR with no mention of a rendering-path difference. Contrast that with SCCT, which does indeed have an SM3.0 path, and they mention that in the review.

Honestly, I side with the story that FEAR does not have an SM3.0 path, and all the SM3.0 regurgitation is people repeating FUD they read somewhere else. Too many trusted review sites have never mentioned SM2.0 vs SM3.0 in FEAR, only DX8 and DX9 paths.

Anyway, I'm not coming down on you guys, I'm just trying to make discussion and dig into this, as it keeps getting repeated. If anyone has some proof of SM3.0 in FEAR, post the links please.
July 30, 2006 8:25:04 PM

When I looked yesterday, I wasn't able to find anything about FEAR using SM3.0 either... I had to speculate, and I wasn't at all sure; it just seemed that might be the case.
July 30, 2006 8:38:02 PM

I'm doubting it does.

And looking back to Far Cry, there was no additional eye candy brought about in patch 1.2 with SM3.0 support; 1.3 added HDR (SM3.0 only). There was a slight performance boost for SM3.0 cards from using a single per-pixel shader pass vs. multiple shaders. SCCT had only PS1.1 and PS3.0 paths at launch (a TWIMTBP title), but a PS2.0 path was later added for the X8xx series.

I honestly think most people just assume there is a big advantage to having an SM3.0 card. Just what is the outcome so far surrounding the SM3.0 battle? I have come to think of its real-world difference as supporting OpenEXR HDR or not.

I mean, patch 1.2's SM3.0 in Far Cry was about an efficiency performance increase. Other games could have extra eye candy associated with SM3.0. Yet everyone ignores the possible performance increase and assumes that the SM3.0 path is rendering a better image and causing lower performance. I'd like to see a list of what games have extra SM3.0 eye candy and what games have a performance boost when running the SM3.0 path, i.e. what games use it, how it is used, what is gained, and what the performance impact is.

GrApe? Where are you? I need a lesson and some reading material. :wink:
July 30, 2006 11:28:31 PM

Quote:
I honestly think most people just assume there is a big advantage to having an SM3.0 card. Just what is the outcome so far surrounding the SM3.0 battle? I have come to think of its real-world difference as supporting OpenEXR HDR or not.
The way technology has evolved over the last 2 years (I got my computer in 2004), if I wanted to run SM3.0 I would likely have to upgrade the motherboard (for PCI Express), the processor (my current one wouldn't fit a new mobo), and to a high-end video card. And SM4.0 is supposed to arrive with DirectX 10.

So basically, I couldn't care less about SM3.0. Few games make very interesting use of it, and as I said before, the HDR effect (which is not exclusive to 3.0 anyway) is overrated in terms of graphical realism.