I just signed up thinking maybe this community might shed some light on an email I sent Tom. I didn't get any response back from him, but hey, I'm sure he's a busy guy. I am just trying to gain some insight here and am truly trying to pose a question, as opposed to trying to bash anyone. I hope no one takes offense to anything I write below and I appreciate any response. Thx.
First of all, I would like to say "thank you" for maintaining a great site that has been a pleasure to visit. I don't have too many "hardware" sites bookmarked, but you have a spot on my Favorites.
I read a review about a month ago on another site (I apologize for not having the link) that benchmarked a P4 1.5GHz vs. an Athlon 1.2 w/DDR memory with regard to how each performed with the GeForce3. Up until that time, every benchmarking article I had read (say, within the last 6 months) comparing Intel's and AMD's latest offerings had the Athlon kicking Intel's butt on the tests I am interested in (i.e., general gaming & graphics). This article was the first that actually had the P4 beating the Athlon in just about every benchmark, and most by a significant margin.
I have been looking into the PC marketplace to (hopefully) buy a new one within the next several months. Not ever having built one myself, and not being confident enough to try, I was leaning towards a vendor such as Falcon-Northwest. I've never bought from them, but I am impressed with their seeming level of quality and attention to compatibility and performance. So I wrote the folks there an email asking if they were aware of these benchmarks and if they might think about changing processors in the future for their GeForce3-equipped machines. This is all assuming the article was not a fluke. Unfortunately, I never heard back.
I'll admit I'm not a guru on all this stuff, and while I enjoy gaming and am a programmer by trade, I get a bit lost reading all the techy-worded articles about these new graphics chips and features. But what I can only speculate MIGHT be a reason why the P4 outperforms the Athlon (if it indeed does) is the following. It seems that as these graphics units become more and more powerful, more of the raw calculations required to go from mathematical data to rendered output are actually being performed by the graphics chip, as opposed to the past, where all (or most) of the calculations were done on the CPU and then the "final product" was shipped to the graphics card to display. So it seems to me that with this latest nVidia chip, enough of these calcs are being done by the chip itself that the advantage the Athlon has over the P4 in floating-point performance becomes less of a factor, and instead the P4's higher bandwidth becomes the deciding factor as raw data is shipped to these power-hungry graphics cards. Maybe I'm missing the boat entirely, but I would be interested in your opinion on whether or not we will see the P4 surpass the Athlon for GeForce3 users.
I'll admit, I had always been an Intel bigot, and only recently, after reading lots of articles on the Net, have I been changing my tune and gearing myself up to go with an Athlon. I just don't know if this benchmark might be the start of Intel closing the gap or taking the lead.
I think when gaming engines become SSE2-enhanced, the P4 will prevail. The P4's enormous bandwidth will also help, because texture sizes are only getting bigger. When games like AquaNox and Ballistics are released, we'll see how it turns out. Right now, the current generation of gaming engines depends heavily on the FPU (where the Athlon is strongest) and not on streaming instructions. This is probably why AMD has decided to adopt SSE2 in the Sledgehammer/Clawhammer line of processors. FPU strength will soon be a thing of the past when it comes to gaming.
So, my advice is to wait and see how new engines perform on a P4/GF3 vs. an Athlon/GF3. Current games do not take advantage of the GeForce 3's technology.
Games do take advantage of the GeForce3. Maybe not all of the new instructions, but at least all graphics applications will see a huge performance increase (don't chime in and split [-peep-] hairs please, 'cause I don't give a [-peep-] about you getting 250 FPS in MS Word or Photoshop).
Running Quake 3 @ 1280x1024x32 with FSAA on and maintaining 120+ FPS is a huge step in performance.
Both the P4 and Athlon platforms are already optimized for the GeForce3.
Off topic: www.the-ctrl-alt-del.com has posted leaked Detonator 12 drivers. There is a huge increase in performance for all NVIDIA products. Enjoy.
You should gain over 1000 points in MadOnion's 3DMark with the new drivers = SWEET
No... what I said is that CURRENT GAME engines DO NOT take advantage of the GeForce 3; there is no possible way they can. Yes, games are faster, but they're not any better looking (except with FSAA). There is no game that takes advantage of per-pixel shading or any of the new T&L enhancements. Currently, the GeForce 2 can only realistically do 4 hardware lights at any given time; the GeForce 3 can do 8. All the GeForce 3 does for current games is give better FSAA (which no one really cares about) and more FPS.
Case in point: AquaNox can barely do 17 FPS at 1024x768 on a GeForce 2 Ultra. The GeForce 3 does 40 FPS with AquaNox, no problem. Why? Certainly not because the resolution is high. It's because there are SOOO many technological enhancements in that engine that the only thing that can run it is the GeForce 3.
PS, thanx fer the link.
"Signatures Still Suck" (Edited by mpjesse on 04/16/01 08:48 PM.)
Fugger, I must commend you for posting an *almost* unbiased message. Except for the last line about Q3, but nobody is denying the P4 kicks ass in that. I'm being serious here. It's nice when we can all have a civilized conversation.
Actually, some games are a bit slower on a GF3 than they are on a GF2 Ultra (surprise!). Some of the brute-force power of the GF2 Ultra got toned down, and finesse got used in its place. That'll probably be nice when games can take advantage of the finesse, but if you were to freeze all game production now, the GF2 Ultra would be the card to have.
"Step away from the gimp suit and put your hands on top of your head."
It seems the two chip companies (Intel and AMD) push each other off the top of the hill every once in a while. Right now, AMD's chips are the best you can get... But pretty soon you will see the P4s pick up steam, especially since Intel is cutting prices in half. BUT, AMD will come out with their Palomino processors, which will push Intel off the hill again.
As for the GF3: I think it is going to be a hell of a card, but it will take time. It depends on whether you want to avoid upgrading again next year. Generally, software and game developers are TWO YEARS behind the current technology, which means if you upgrade to a GF3, you won't need to upgrade again for about 2-4 years. Or you could pay out some money now for a decent average card but end up wanting to upgrade again next year, and by then you would have paid more than the GF3 would have cost you. That is how nVidia makes their money: not off the new cards, but off the people who are upgrading from 1-2 year old cards. (i.e., pay $350 now for a GF2 and in 1 year pay another $350 for a GF3, and you've paid $700 in 1 year instead of paying just $550 for a card that will last you a good 2 years if not more.)
"Right now, AMD's chips are the best you can get... But pretty soon you will see the P4s pick up steam, especially since Intel is cutting prices in half"
I fail to see how price has anything to do with which CPU is best. Perhaps it affects which you buy, but not which is best. Currently, Intel's P4 is the 'best'. It's just expensive.
-- The center of your digital world --
April 18, 2001 3:38:07 PM
Thanks all for your insightful answers. I agree with much of what has been said. I am curious, though, whether anyone has seen any results confirming or denying the one benchmark I cited. (Damn, I wish I had the URL for it...) I'm just curious that with the GeForce3 specifically, the only benchmark I have seen so far had the P4 beating out the Athlon, where normally it had been the other way around for quite some time. Does anyone think this is a fluke? Or will it prove to be true? And if so, why?
Welcome to the (programmers-who-are-not-hardware-gurus) club.
You might find the www.gamepc.com site interesting and their configurations useful as a frame of reference. Unlike some others, their configurator shows you exactly what components are included in a system.
I hope you get your answers. When these folks get back from the movies, I'm sure they will give you a lot of good information.
April 18, 2001 6:57:49 PM
Taking advantage of the average consumer by overpricing pisses me off. Don't buy from Intel until they punish their marketing department...