
Does the latest P4 really outperform a Dual Athlon?

May 20, 2002 7:03:10 PM

I work in the graphics industry creating 3D models and animations. I am a firm believer in dual-CPU systems (for what I do) and love the performance we have achieved by building many Dual Athlon workstations and render servers. I was surprised by the latest article about the P4/2533: in the Lightwave render bench it comes out faster than the Dual 1900+ machines benched in another article.

I have to spec and build 6 more workstations by mid-September. I had planned on them being Dual 2000+ (or latest) MPs on a Thunder K7 with 2-3GB DDR, IDE RAID, and a Wildcat 5110 graphics card. (This system works well; we are currently running a SCSI RAID version.)

Here is my question. If a single P4/2533 beats a Dual 1900+ MP, what counter move does AMD have to compete?
May 20, 2002 7:10:41 PM

Not all reviews are created equal, and the best way to compare benchmarks is within the same review, or at least with hardware as similar as possible.

For some things, yes, a single P4 2533 will be faster, but so would an XP 2200+. However, many graphics programs do know how to use dual CPUs to the fullest. The only thing that hinders a dual Athlon system is limited memory bandwidth, which the P4 can easily outclass with its Rambus RAM.

"Search your feelings you know it to be true, I am your... twin sister" - Darth Vader
May 20, 2002 7:28:24 PM

Quote:
Here is my question. If a single P4/2533 beats a Dual 1900+ MP, what counter move does AMD have to compete?


To answer your question? Nothing right now but to wait until Hammer is released.

I think AMD is going to have a really hard time competing for the next couple of quarters. They will not get the performance crown back until Hammer is released. And even then, Hammer's performance will still only be on par with, or slightly above, the top-of-the-line P4 available.

About the benchmarks: you will find that the Lightwave benchmark uses SSE2, which AMD processors don't have yet. More and more applications will come out with support for SSE2, which seems to boost P4 performance a lot. And when selecting a processor, please look at benchmarks close to what you will be using the machine for. It doesn't make sense to look at an office-productivity benchmark when the machine will mainly be used for 3D games.

KG

"640K ought to be enough for anybody." - Bill Gates.
May 20, 2002 7:38:26 PM

Thanks.
I was unaware that Lightscape was SSE2 optimized.

We primarily use 3DSMax for animations. We selected the Athlons for their FPU performance, since Max is not yet SSE2 optimized. In order to compare apples to apples, does anyone know what exact 3DSMax benchmark was used in the P4/2533 review, and where I can get it? I would like to run it on our machines to compare.
May 20, 2002 7:48:45 PM

Quote:
Thanks.
I was unaware that Lightscape was SSE2 optimized.


Lightwave is faster on a P4, but not because of SSE2. Many people believe something foul is afoot, given that a 1.3GHz Celeron on a 100MHz FSB can beat a 1600+ Athlon XP with 266MHz DDR in that benchmark.

:wink: The Cash Left In My Pocket,The BEST Benchmark :wink:
May 20, 2002 7:50:55 PM

I second that. Newtek has some serious work to do, for not including AMD strings or whatever.

--
Luke, I am your father...but due to a bacon-slicing accident, your mother... :lol: 
May 20, 2002 9:37:25 PM

Intel and Newtek are pretty close, and Lightwave is extremely Intel-optimized. Whether it's showing what the P4 is capable of, or is simply an unethical approach to software optimization, is for others to decide.

If you're using 3DSMax, then look for 3DSMax benchmarks. I would think that a dually Athlon system would perform a fair amount better in that app than a single P4 system, and probably not for a huge price premium, either (when talking about 1900+ and 2.5B).

<font color=blue>Hi mom!</font color=blue>
May 20, 2002 10:09:34 PM

I agree with the guy up there. It shouldn't be possible that a first-gen P4 with its L2 cache stripped in half, on only a 100MHz FSB, can beat the newest CPU. I think Intel has a deal with that company.
Anyway, if you are going to build workstations, then choose whatever you like; both perform in the same range. BUT remember: different projects stress the CPU differently, so the results you get will differ.
May 20, 2002 10:09:58 PM

so the bottom line is test it yourself first
May 21, 2002 1:26:25 AM

Quote:
I second that. Newtek has some serious work to do, for not including AMD strings or whatever.

nope, AMD is the one sitting on their a$$es.



"<b>AMD/VIA!</b>...you are <i>still</i> the weakest link, good bye!"
May 21, 2002 1:55:39 AM

The programmers are the ones responsible for implementing the optimizations in their apps, not the CPU manufacturer. You can do better than that Melty. :wink:

:wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
May 21, 2002 2:10:17 AM

LoL, I suppose you know all about coding, which is why we see so many apps coded by "AmdMeltdown."

Wait, we don't...

Somehow I doubt you know whereof you speak. You'd be wise to hold off on such comments until you've coded something even close to the same level as LightWave or 3DSMax. :tongue:

<pre>We now <b>return</b>(<font color=blue>-1</font color=blue>) to an irregular program scheduler.</pre><p>
May 21, 2002 2:42:22 AM

Are we talking about two different apps, or just misusing the name? There is an app called Lightscape and an app called Lightwave; two different things. I'm not 100% sure (I've been out of the 3D world for a bit), but I don't think Lightscape is SSE2 optimized. Lightwave 7.0a and on is. You know what I think is sort of humorous about Lightwave: it started on the Amiga (a Motorola platform), yet Newtek is doing all this stuff, it seems, exclusively for Intel. I guess money makes up for lack of ethics :p

[insert philosophical statement here]