How much difference does FSB make? If, say, I could put the same graphics card in one of these two systems, would there be a small difference or a big difference?
* 3.06 GHz P4 with 133 FSB
* 3.00 GHz P4 with 200 FSB
I would try to turn the graphics detail up high enough in games to maintain 40 fps. Differences above 40 fps wouldn't matter much to me; it's only a problem if the 133 FSB would limit me to below 40 fps. The card I'm planning on is a 6800 GT.
BTW, I'm using those processors as an example. The two actual systems I can choose between are: A, dual 3.06 GHz Xeons with a 133 FSB; B, an Athlon XP Mobile at 2.3 GHz with a 200 FSB. (I already have both systems; I'm choosing which one to upgrade the graphics card in.) If the 133 FSB of the Xeon system is going to hurt me a lot in games, I'll consider upgrading the Athlon system instead.
edit: BTW, I know a faster FSB would be good to have. The question is whether we're nitpicking over tiny percentages here, or talking about a 15% to 20% drop by going with the 133 FSB.
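To make the threshold concrete, here's a quick back-of-the-envelope sketch. The 46 fps baseline is a made-up number for illustration, not a measurement from either system:

```python
# Hypothetical illustration: suppose the 200 FSB system manages 46 fps
# at my chosen detail level. At what FSB penalty does the 133 FSB
# system fall below the 40 fps floor?
def fps_after_penalty(base_fps, penalty_pct):
    """fps remaining after an FSB-related percentage hit."""
    return base_fps * (1 - penalty_pct / 100)

for penalty in (5, 15, 20):
    print(f"{penalty}% hit -> {fps_after_penalty(46, penalty):.1f} fps")
```

In other words, a tiny-percents answer (around 5%) would keep me above 40 fps, while a 15% to 20% drop would push me under it.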
Edited by grafixmonkey on 11/30/04 11:41 PM.
I wish I could try both! I have two options. I can sell the Athlon system and use that money to buy a Quadro FX 4000 for the Xeon workstation, which according to nVidia falls between 6800 GT and Ultra performance in games. Or I can keep using my outdated Quadro4 in the Xeon system and get a 6800 GT for the XP Mobile system, plus maybe a replacement motherboard so I can overclock some more.
(surprisingly, the Quadro FX 4000 option ends up being slightly cheaper!)
I finally hammered out the performance of the Quadro FX 4000 in the Video Cards forum, but I'm still not sure how big a hit the 133 FSB will be. I guess I could swap the Quadro4 between systems and benchmark that, but the card is so old it has trouble playing anything more demanding than UT2004 on "bleh" quality.
Maybe the dual processors will give enough of a minor boost to counter the FSB deficit, even though games aren't really multithreaded?
Oh well, I'll probably end up getting the Quadro 4000 anyway, just because I should really be spending more time on Maya than on games.
Yup, both cards are exactly the same down to the PCB and memory, but a simple resistor is all that keeps you from installing the FireGL drivers on a gamer's card. That is, unless you use the SoftGL script to override the safeguards that are supposed to prevent you from doing that, without voiding any warranty.
FSB doesn't matter as much on AMD systems; they aren't as memory-bound. On the other hand, a 2.2 GHz XP chip is about as good at gaming as a 3 GHz Intel chip, so the XP-M system would be 5% to 10% faster than the Xeon in games. (If the game is CPU-intensive, the 10% applies; games that don't use much CPU wouldn't be affected much either way.)
How about buying a Radeon 9800 and using RivaTuner to turn it into a FireGL X1?
And in case you didn't know, any nVidia card can be modded into its Quadro equivalent.
The Quadro FX 4000 actually uses a different graphics core: it's neither a GeForce FX core, a Quadro FX core, nor a 6800 core. Its codename is NV40GL. That makes me think it's not possible to mod a 6800 Ultra into a Quadro FX 4000, but if you know otherwise I'm interested.
The FireGL cards have performed below the Quadro cards in the benchmarks I've seen, by large margins. I'm starting to model with really high-poly-count surfaces, subdivision surfaces, and the new live display of bump and specular mapping (using pixel shaders), so a new Quadro will help me a lot...
And I'm already gaming with a Radeon 9800 Pro and not getting very good performance out of it for some reason... maybe I'm expecting too much. At least it played HL2 all the way through with good detail at 1024x768.
To reiterate: do you think there will be a BIG difference or a SMALL difference? I know that a 200 FSB is better. My question is whether it matters so much that the Quadro FX 4000 isn't worth getting if I have to live with the 133 FSB, and I should keep trudging along with the Quadro4 and get a 6800 GT for my other system instead.
At this point, I'm still leaning towards the Quadro, unless more people vote that the 133 FSB will be debilitating.
Edited by grafixmonkey on 12/01/04 09:19 PM.
OK, never mind everyone. I set some time aside and did some mathematical analysis with the help of material I stumbled on at Anandtech and some of Tom's older CPU reviews that still included the Athlon XP.
It looks to me like the "FSB factor" is about a 5% performance hit in my case, comparing a 3 GHz 200 FSB P4 to a 3.06 GHz 133 FSB P4. For the Athlon XP it seems more severe: I see a slightly larger difference between FSB speeds at the same clock, and that's over a smaller FSB gap (166 to 200 rather than 133 to 200). I couldn't find any case where equivalently clocked processors differed by more than 6% between a low FSB and a high FSB.
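The arithmetic behind that "FSB factor" estimate can be sketched like this. The fps pair is a hypothetical stand-in for the Anandtech/Tom's figures, not the actual review data:

```python
# Percent drop going from the fast-FSB score to the slow-FSB score at
# (roughly) the same clock speed. The example numbers are hypothetical
# placeholders, not real benchmark results.
def fsb_factor(fps_fast_fsb, fps_slow_fsb):
    """Percent performance hit attributable to the slower FSB."""
    return (fps_fast_fsb - fps_slow_fsb) / fps_fast_fsb * 100

# e.g. 3.0 GHz / 200 FSB P4 at 60 fps vs 3.06 GHz / 133 FSB P4 at 57 fps
print(f"{fsb_factor(60, 57):.1f}% hit")
```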
I also noticed that in many benchmarks I wouldn't have been likely to get the Athlon XP Mobile to beat the 3.06 Xeons, even if I could overclock beyond 2.4 GHz. So I'm getting the Quadro and selling the Athlon.
That's some pretty interesting stuff. If his benchmarks can be trusted as representative of a real FX 4000 (even though his is a modded one), then my game performance won't suffer much at all. I'll have to load up RivaTuner when I get the card to make sure it actually has all 16 pixel pipelines enabled. I had no idea that utility gave you that much control over the card.
Possibly because people were modding GeForces into Quadros. Then again, their biggest market is the professional sector, and that market won't even consider things like BIOS flashing and board soldering that could hurt stability, so they might not care much about the modding. If that's not the reason, it's probably to compete with companies like 3DLabs, who don't make gaming cards at all and engineer their cards from the ground up to be insanely fast in professional OpenGL apps. (They even have cards that do multi-GPU processing on a single board!)
nVidia makes some of the best gaming cards out there, so maybe they're starting to realize that if they apply that engineering talent to dedicated pro chips, they could do the same in the professional OpenGL card market too.