
Well, are people forgetting something?

June 5, 2001 7:45:01 PM

Well, the Athlon MP is going up against P4 Xeons with 256 KB of L2 cache, while most P2 and P3 Xeons sold with 512 KB of L2. I'd like to see an Athlon MP fare against a 2 MB L2 P4 Xeon and a 512 KB L2 Northwood.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:


June 5, 2001 7:51:13 PM

Hey, I'll gladly slap all that hardware together and run some benchmarks, if you can provide the hardware. :wink:

Kelledin

"/join #hackerz. See the Web. DoS interesting people."
June 5, 2001 7:53:11 PM

lol, not bad. But if it were Itanium vs. the Athlon MP, the Itanium would smash it to pieces.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
June 5, 2001 7:55:13 PM

And then we'd have to include the Sledgehammer. Or has everyone already forgotten it?

Your Signature Sucks
June 5, 2001 8:04:28 PM

I'm talking about today, buddy. Hammer is in Q2 2002.

Itanium is now. Only the Alpha and the SPARC III are able to take it on. IA-64 is the next phase in desktop computing; even AMD will switch to IA-64.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
June 5, 2001 8:10:54 PM

Great. I can get an Athlon MP at PC Progress, and I can get a Xeon 1.7 at Acme Micro. Where are they selling Itaniums?? Or did you mean a couple of months from now by "today"?


Your Signature Sucks
June 5, 2001 8:14:50 PM

Desktop computing... yet sooo far away even from power users. I want an analysis of the price/performance ratio on that please.

"He who laughs last doesn't get the joke"
June 5, 2001 8:18:57 PM

Also, there wasn't even a benchmark for dual 1.4 normal T-Birds. Tom's article stated that they wouldn't run with differently clocked T-Birds. There was no mention of dual 1.33s or 1.4s.

Your Signature Sucks
Anonymous
June 5, 2001 8:34:18 PM

There is no dual Northwood yet...
Anonymous
June 5, 2001 8:51:30 PM

>Itanium would smash it to pieces.

Maybe.
Maybe not. Particularly on 32-bit apps (but then, why buy a 64-bit CPU for 32-bit apps...).

In any case, we're not likely to see much on Itanium from Tom, Anandtech, Aces, etc., until Q4, because these sites generally aren't interested if it doesn't do Windows (http://www.zdnet.com/enterprise/stories/linux/0,12249,2...).

Cognite Tute
(Think for Yourself)
Anonymous
June 5, 2001 8:58:05 PM

64-bit CPUs aren't going to suddenly take over. If a new technology can't do everything today's technology can, plus the new stuff, it will have a hard time being adopted right away.
June 5, 2001 9:15:22 PM

Oh, and let's not forget price: Athlon MPs are selling for around $250, while the best price for a Xeon 1.7 is $600! Now, I work in 3D Studio Max R3 on presentation models, and that benchmark makes up my mind very quickly.

Your Signature Sucks
Anonymous
June 5, 2001 10:05:51 PM

Whoops, but it'll have to be A4 then, and we'll have to see how it does.........

Aklein

It's raining outside, and my lawn has grown a foot overnight!
Anonymous
June 6, 2001 2:30:08 AM

Errr....kelder...someone who is still learning to create new folders shouldn't be worried about 64 bit units. Once you learn to create a new folder then maybe you can move onto wordpad....and then MSpaint...

"Cock-a-doodle-do" is what I say to my girl when I wake her UP in the morning!!
June 6, 2001 3:21:44 AM

Oh, grow up. Do you actually know Kelder? You sound like a little kid with nothing better to respond with, so you start throwing out comments you know aren't true. Your comments lately are about the equivalent of "Oh yeah, well I screwed your mom last night... so there."

Grow up!
Anonymous
June 6, 2001 3:37:56 AM

Bwahahahaha... did your boyfriend squeeze your testicles a little too hard? Man, what are you freaking out about? Besides, Kelder knows exactly what I'm talking about. So lame off.

"Cock-a-doodle-do" is what I say to my girl when I wake her UP in the morning!!
June 6, 2001 3:42:06 AM

"Bwahahahaha.....did your boyfriend squeeze your testicles a little to hard?"

Thankyou for proving my point.
Anonymous
June 6, 2001 4:04:58 AM

That's the way... at least you're attempting humour now. Sounded like you were wearing iron undies before.

"Cock-a-doodle-do" is what I say to my girl when I wake her UP in the morning!!
Anonymous
June 6, 2001 4:22:43 AM

Yeah, I know you're a moron and can't accept when you're wrong. You scan for any post I make and post some crap like that; you need a life, man. It is actually funny how everyone knows you're a retard that can't help but post immature insults that are just plain laughable, even though you are trying to make people feel bad. Grow up already.
Anonymous
June 6, 2001 4:30:40 AM

OK then, M_kelder, tell us all: why would somebody like you need a 64-bit CPU? I know, and you know, that you only use your computer for menial stuff (even though you did once pretend to be an animator, then retracted that statement), and that you bagged the 3Dlabs cards but failed to post any criticisms when faced with strong technical opposition.

So then, M_kelder, tell us all: why would somebody like you need a 64-bit CPU?

"Cock-a-doodle-do" is what I say to my girl when I wake her UP in the morning!!
Anonymous
June 6, 2001 6:54:34 AM

Read the [-peep-] posts, moron. I didn't say [-peep-] about me wanting a 64-bit CPU; you're such a moron, do you ever read the posts? I showed the benchmarks for the 3Dlabs card vs. the GeForce and FireGL cards, but you must have forgotten, or your mind blocked it out because you can't handle being wrong!

You don't even know me, mother fucka! I never ever posted saying, "I am a professional animator, I know all, listen to me."

Have a nice day :) 
Anonymous
June 6, 2001 8:17:12 AM

I originally said..."Errr....kelder...someone who is still learning to create new folders shouldn't be worried about 64 bit units. Once you learn to create a new folder then maybe you can move onto wordpad....and then MSpaint..."

So don't concern yourself with what is irrelevant to your life. You have no need for fast CPUs, quick hard drives or professional 3D cards. You surf the net and play games. And you did once tell us all you were an animator. Then, when you were asked a couple of basic questions, you avoided the post whilst dumping your garbage thoughts in other posts. Just because you couldn't justify yourself earlier doesn't mean you ought to get grumpy.

So kelder, why would somebody like you need an ultra-quick machine? What professional purpose would you put it to? Why are you even here? Shouldn't you be learning to rescale fonts or something like that?


"Cock-a-doodle-do" is what I say to my girl when I wake her UP in the morning!!
Anonymous
June 6, 2001 9:06:19 PM

"You have no need for fast cpu's, quick hardrives or professional 3D cards. You surf the net and play games. And you did once tell us all you were an animator. Then when you were asked a couple of basic questions you avoided the post whilst dumping your garbage thoughts at other posts"

Where's that fool?
June 6, 2001 9:31:35 PM

Hey, I play games and surf the internet (like you), and I am no professional artist or engineer, but I am still interested in 64-bit stuff and performance machines. Who says that I shouldn't try to improve my understanding of advanced professional equipment? It seems like it is the gamers that are pushing the high-end market anyway, especially in graphics and programming. So I don't understand your logic at all. What do you do for a living, Tonestar? Are you a 3D artist? If so, what steps do you see someone taking to become proficient in 3D artwork? What equipment do you have, what do you recommend, and why? I've noticed that virtually everything m_kelder has said is true, particularly about T-Birds, dual processing and 3dsmax. The recent benchmarks all prove he was correct, and they weren't even available when the topics were discussed in detail. Do you have some samples of your artwork to show us?

Well to eat your C :smile: :smile: kie and have it too, gotta get Rade :smile: n II
Anonymous
June 6, 2001 9:45:23 PM

No, he isn't a 3D artist. He doesn't even have a clue what it is about, as he told me that he knows hardware and thus would be the best animator ever. Lol, tonestar, when you said that it proved you have no clue whatsoever, so you should stop trying to attack me. As you see, others have read the threads we've had and seen that I showed all the benchmarks and tried to explain them to you, but you are too ignorant.

The only professional animator I know of who has come here said that I was right about the T-Birds, and he is the one who told me that the 3Dlabs cards that cost $700 now don't perform better than a GeForce 2. I've seen and shown benchmarks that back that up. Unlike you, he really knew what he was talking about.
June 6, 2001 9:59:28 PM

Is he the one who said that the MX was virtually as fast as a GF2 in 3dsmax, which appears to be 100% correct? I liked that guy; I wish he would return and give us more good info.

Well to eat your C :smile: :smile: kie and have it too, gotta get Rade :smile: n II
Anonymous
June 6, 2001 10:01:10 PM

Yeah, that was him. Glad he told me that, or I would have bought a card that performed the same for $200 more.

I think I have his contact info on the other hard drive. I can pick up my hardware tomorrow!
June 6, 2001 10:06:23 PM

Well, I wonder how a dual-chip Radeon 2 would work in 3D Studio Max. Like I say, I think the Radeon 2 MAXX will be called the FireGL 5.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
Anonymous
June 6, 2001 10:15:06 PM

Yikes, that'd be the machine to have. I'm going to have to start saving up now. If ATI could have the tessellation done on the GPU and sent back to 3dsmax, that'd be cool, although the bandwidth from AGP to CPU would be just as slow as from RAM to CPU anyway.
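
Just to put rough numbers on that bandwidth point, here's a quick back-of-the-envelope sketch in Python, assuming an AGP 4x slot and PC133 SDRAM (a typical setup when this was posted); both figures are theoretical peaks, nothing measured.

# Rough theoretical peak bandwidths, mid-2001 assumptions:
# AGP 4x = 66 MHz clock x 4 transfers/clock x 32-bit (4-byte) bus
# PC133  = 133 MHz x 64-bit (8-byte) bus
agp_4x_mb_s = 66.67e6 * 4 * 4 / 1e6
pc133_mb_s = 133.33e6 * 8 / 1e6

print(f"AGP 4x: {agp_4x_mb_s:.0f} MB/s")  # ~1067 MB/s
print(f"PC133 : {pc133_mb_s:.0f} MB/s")   # ~1067 MB/s
# Same ballpark, so pulling data back over AGP is no faster than
# pulling it from system RAM, just as the post says.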

Oh, and sorry for crapping up your thread; tonestar just doesn't care where he picks his fights... I think every post I've made this week has had him reply with something stupid like "your still learning to make folders". What a goof.
Edited by m_kelder on 06/06/01 06:18 PM.
June 6, 2001 10:18:06 PM

Maybe it will be able to support AGP 8x.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
June 6, 2001 10:19:03 PM

Well, as interesting as it sounds, ATI's first dual-GPU offering was a miserable failure, to the point that they had to disable 85% (my estimate) of the second GPU.

A little bit of knowledge is a dangerous thing!
June 6, 2001 10:22:29 PM

Well, I think the Radeon 2 MAXX will work like this: if the first GPU can't take the load, the second GPU will help it out; if not, the second does nothing. That should eliminate the bottlenecks of the Radeon 2.
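
Purely as illustration, here's a tiny Python sketch of that spill-over idea; the capacity number and names are made up, and this says nothing about how ATI's driver actually works.

# Toy model of the spill-over scheduling idea above.
# PRIMARY_CAPACITY is an arbitrary, invented number of work units.
PRIMARY_CAPACITY = 100

def split_work(frame_load):
    # Give the primary GPU as much as it can take; any overflow goes
    # to the second GPU, which stays idle on light frames.
    primary = min(frame_load, PRIMARY_CAPACITY)
    secondary = frame_load - primary
    return primary, secondary

for load in (60, 100, 170):
    p, s = split_work(load)
    print(f"load={load}: primary={p}, secondary={s}")
# load=60  -> primary=60,  secondary=0   (second GPU doing nothing)
# load=170 -> primary=100, secondary=70  (second GPU helping out)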

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
Anonymous
June 6, 2001 10:24:32 PM

What Discreet and the other 3D app makers out there need to do is team up with video card manufacturers and make PCI cards, each responsible for one viewport. Although I can already see a problem with that, as the CPU would get too big of a workout sending everything to each card over PCI. A better approach would be an AGP card that has a separate GPU for each viewport. Each GPU could have its own memory bank as well as a larger shared one. I think that'd work great, though I'm no electronics designer.
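
For what it's worth, here's a minimal Python sketch of that one-GPU-per-viewport idea; the class and the memory sizes are invented for illustration, not taken from any real card.

# Hypothetical sketch: four GPUs on one card, each statically
# assigned one of 3dsmax's four viewports, each with a private
# memory bank plus access to a larger shared bank.
VIEWPORTS = ["top", "front", "left", "perspective"]
SHARED_BANK_MB = 32

class ViewportGPU:
    def __init__(self, name, local_mb):
        self.name = name
        self.local_mb = local_mb  # private bank for this GPU

    def draw(self, viewport):
        print(f"{self.name} ({self.local_mb} MB local, "
              f"{SHARED_BANK_MB} MB shared) renders '{viewport}'")

gpus = [ViewportGPU(f"gpu{i}", local_mb=16) for i in range(len(VIEWPORTS))]
for gpu, view in zip(gpus, VIEWPORTS):  # fixed one-to-one mapping
    gpu.draw(view)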
June 6, 2001 10:37:20 PM

What would work better (rough toy model below)?

-sharing one big 64 MB bank of RAM

-having two separate 32 MB banks, one for each GPU

-having two separate 16 MB banks for the GPUs, plus a shared 32 MB bank
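
Here's a back-of-the-envelope toy model of those three layouts, assuming (purely as a made-up number) that half the texture data is needed by both GPUs and therefore has to be duplicated in any private bank:

# Toy capacity model for the three layouts above.
# dup_fraction is the invented share of data both GPUs need; data in
# a shared bank is stored once, but data both GPUs need must sit in
# BOTH private banks, so each private MB yields (2 - dup) unique MB
# across the pair.
def effective_mb(shared_mb, private_mb_each, dup_fraction):
    return shared_mb + private_mb_each * (2 - dup_fraction)

for name, shared, private in [("1x64 shared", 64, 0),
                              ("2x32 private", 0, 32),
                              ("2x16 + 32 shared", 32, 16)]:
    print(name, effective_mb(shared, private, 0.5), "MB of unique data")
# 1x64 shared      -> 64.0 (but both GPUs contend for its bandwidth)
# 2x32 private     -> 48.0
# 2x16 + 32 shared -> 56.0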

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
Anonymous
June 6, 2001 10:47:49 PM

They could probably do better with each GPU having 32 MB; they could tie it closer to the GPU, but then again I think they'd need some way to communicate with each other. Hell, just make a card that has a GPU with a 64 MB cache! The cache would be the size of the chip, but it would run fast :)
June 6, 2001 11:20:56 PM

Huh? Why not just use two video cards? Actually, this is where the nForce starts to get interesting. Rumour has it that it can support two AGP interfaces (one internal and one external). This is also further fueling speculation about a possible SMP solution in the future with this chipset, as the internal AGP could theoretically be substituted for another CPU channel.

http://images.anandtech.com/reviews/chipsets/nvidia/nfo...

A little bit of knowledge is a dangerous thing!
Anonymous
June 7, 2001 2:13:11 AM

Oh kelder, for god's sake... what do we have to do to make you believe it? The article titled "Professional affair" in THG under "Graphic Guide"... do you not believe the reviewer? He clearly points out the difference in ground between the GeForce cards and the pro cards. Visit any 3D or CAD site and have a look at their reviews; some even include the games cards to clear up the myths that people like you have come to believe.
The reason you can live with a games card is because you don't create complex models or scenes; if you did, you'd realise that it takes forever to move and manipulate these objects.
You're still trying to pretend you know more than the reviewer. If the GeForce cards were just as good, Nvidia would rename and repackage the same card, but even in their own range they have modified the card. Why bother if the GeForces are just as good? Why then do the Quadros benchmark better? You are soooooo stupid!!!

If anyone would like to see what kind of a deceitful little bitch kelder is, go to a post called "Graphic designer computer(help)" posted by maiden_hell in the CPU forum. See:

http://forumz.tomshardware.com/modules.php?name=Forums&...

The conversation gets onto professional graphics cards, and kelder is unable to defend his position on gaming cards, so he decides not to return. How pathetic. How funny!!!
So kelder... does it make you feel good to come here and preach to some ignorants? You don't know what you're talking about, so keep your cock-suckin' mouth shut!!!

Gee, what does it take to get through your thick skull?

"no kelder....you can't shoot goblins in wordpad"
June 7, 2001 2:33:40 AM

What are you talking about?? It seems like m_kelder picked out the best performance for the buck anyone could purchase for 3dsMax. Check out this report about 3dsMax and different video cards:

http://www.xbitlabs.com/video/3dmax/index2.html

By the way, how much do those professional cards sell for? And which ones are you talking about?

Well to eat your C :smile: :smile: kie and have it too, gotta get Rade :smile: n II
Edited by noko on 06/06/01 10:34 PM.
Anonymous
June 7, 2001 2:34:40 AM

we wuz just suposen

I'll have to do some testing, but I'm sure I tripled my performance over my older system.
Edited by m_kelder on 06/06/01 10:36 PM.
Anonymous
June 7, 2001 2:37:18 AM

"titled "Professional affair" in THG under "Graphic Guide"
is over a year old moron!
It is outdated and obsolete

"The reason you can live with a games card is because you don't create complex models or scenes, if you did you'd realise that it takes forever to move and manipulate these objects."

Just look at some recent benchmarks and you'll see that when it comes to high poly that grinds a system to a halt both the geforce and any 3dlabs card performs the same!

"Your still trying to pretend you know more than the reveiwer. If the geforce cards were just as good nvidia would rename and repackage the same card, but even in their own range they have modified the card. Why bother if the geforce's are just as good? Why then do the Quadro's benchmark better? You are soooooo stupid!!!"

Ummm, the quadro has no changes over a geforce besides clock! The 'only' reason a quadro outperforms a geforce is the driver. Nvidia disables many things for the geforce so it doesn't perform quite as well as a quadro. Obviously you don't know what you talking about and I've researched it...

Oh, and about the other thread: I told him to wait for the Northwood and dual Athlons to come into retail; what is bad about that? And I told him that he would be better off buying a GeForce2 instead of a 3Dlabs card, because there is hardly a difference in performance but a huge difference in price.

Sit down, tonestar.
Edited by m_kelder on 06/06/01 10:45 PM.
Anonymous
June 7, 2001 2:42:23 AM

Oh noko, you have no idea, do you? Did you read that article in THG? I think it was in May 2000.

All the cards in that test are games cards. All of them. Have a look at any of those cards compared to the 3Dlabs, Elsa or FireGL cards... they don't even remotely compare.

Yes, these cards are often a lot more expensive, but for a REAL designer they are essential. That's why any real person can pick kelder for a fraud from a mile away.

And noko... why not read my links before you post back crazy stuff?

"no kelder....you can't shoot goblins in wordpad"
Anonymous
June 7, 2001 2:44:51 AM

You freakin' dumb bastard... have a look at these cards side by side before you keep spilling garbage...

"no kelder....you can't shoot goblins in wordpad"
Anonymous
June 7, 2001 2:48:00 AM

Lol man, I've shown you benchmarks before! Don't you remember, or are you just stupid? I'll post them again; guess you're thick!

Oh, and thanks for putting me in your sig; it shows how little of a life you have that every day you try to attack me some way or another.
June 7, 2001 2:54:20 AM

Can you show us some of this professional work that you do?
Anonymous
June 7, 2001 2:57:18 AM

Here we go...
These numbers were given to me at the Discreet web board when I asked which card I should get. These people know many times more about this than anyone here, just to let you know.

Scores for the FireGL2, FireGL3, Oxygen VX1 PRO, Oxygen GVX420, Gloria III and GeForce2, in that order:

Test                    FireGL2  FireGL3  VX1 PRO  GVX420  Gloria III  GeForce2
raster                     77.5     77.5       16    37.7       100.2      85.5
geo2 (high poly)            4.4      4.4      2.1     2.2         2.5       2.4
wireframe (lower poly)     11.8     11.9      5.7       6         7.6       4.4
4 views                    12.5     12.6      4.8       5         7.8       3.7
light1                     74.6     72.7      4.5    10.9        50.7      41.8
light2                     74.5     72.9     20.7    31.3        52.1      51.2
light3                     78.8     72.9     13.1    15.9        52.7      51.3
text1                      68.7       70      ???    49.2         114        81
text2                      41.7     42.5     10.4    18.2        42.7      40.5
Whew, that took too long :)
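
(To make explicit what everyone below is arguing about, here's a small Python sketch that just recomputes per-test ratios from the table above, GeForce2 over FireGL2; the scores are copied straight from the post, nothing else is assumed.)

# GeForce2 score divided by FireGL2 score, per test, from the table
# above; > 1.0 means the gaming card scored higher.
scores = {  # test: (firegl2, geforce2)
    "raster": (77.5, 85.5), "geo2": (4.4, 2.4),
    "wireframe": (11.8, 4.4), "4views": (12.5, 3.7),
    "light1": (74.6, 41.8), "light2": (74.5, 51.2),
    "light3": (78.8, 51.3), "text1": (68.7, 81.0),
    "text2": (41.7, 40.5),
}
for test, (firegl2, geforce2) in scores.items():
    print(f"{test:9s} GeForce2/FireGL2 = {geforce2 / firegl2:.2f}")
# The GeForce2 wins raster and text1, roughly ties text2, and falls
# behind on the geometry, wireframe, four-view and lighting tests.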

As you see, tonestar, I speak from benchmarks; you just speak from your ass.
Anonymous
June 7, 2001 2:59:02 AM

He doesn't do it; he never said he did. Read this thread again and you'll see what he said to me before; it is quite funny.

Oh, and by the way, tonestar, "dumbest" is slang, which implies your stupidity. Heh, everything you've said so far just makes you seem more and more retarded; keep it up!
Edited by m_kelder on 06/06/01 11:02 PM.
Anonymous
June 7, 2001 3:02:43 AM

OK kelder... because you're such a freakin' mule and you never listen to me, I am going to quote some other people:

Kelledin:
For 3D, the GeForce series cards are gaming cards. The Quadro series cards (beefed-up GeForce series chips) are still "wannabes" in the professional 3D world. Cards like the Elsa Gloria series cards and the Permedia series chipsets (used in the Diamond FireGL cards) still trounce the Quadro for professional 3D (geometry processing power is more important than fill rate in this scenario).

Makaveli:
Don't listen to these children. The GeForce 3 is far from a professional 2D/3D card for graphic design. What I would suggest you do is list the software you will be using, and perhaps, if you know anyone in the same field, find out what they are using and how it works for them. Don't listen to these kids' poor advice. When it comes to professional video cards for that kind of usage, Nvidia ain't smack! Next time you post a topic like this, ask for responses from people in the same profession; their advice will be far more valuable to you.

I can't speak for everyone, but there are a lot of children on this forum, and they sometimes give a lot of uneducated advice and opinions.

Makaveli was referring to you, idiot. This is what people have to do to try to tame an idiot like you.

!!!!BANG!!!!! You're dead.....

"no kelder....you can't shoot goblins in wordpad"
Anonymous
June 7, 2001 3:08:26 AM

In reply to Kelledin's post: he failed to mention that the only cards that beat Nvidia chips are the Wildcats and FireGLs, which are in the $1000s, and if you can afford them then, as I have always said, go right ahead.

As for Makaveli's post: the GeForce 3 hasn't followed Nvidia's previous releases, where brute force was all gamers wanted; and brute force is what 3D artists want too.

Glad I could clear that up for you, tonestar.

Oh, and I remember where the guy who gave me those benchmarks said they came from. They were in a 3D artist mag (I can't remember which one), so it isn't hearsay.
Edited by m_kelder on 06/06/01 11:10 PM.
Anonymous
June 7, 2001 3:13:51 AM

Wow, kelder, thanks for that... look at those numbers!!!

The pro cards certainly have a huge difference in the geometry, wireframe and lighting tests, don't they? And like I have attempted to explain to you, this is the way people design; they don't render every couple of seconds... negating the "work time" relevance of the raster and texture marks. Hardware shading doesn't measure up against raytracing, diminishing those marks again.

Anyone who knows what the marks are all about must be crying from laughter... oh kelder... talk about incriminating evidence!!

What a fool you are; you even proved yourself an idiot.... Thank god you're not a doctor!!

Suck the whopper!!!

"no kelder....you can't shoot goblins in wordpad"
Anonymous
June 7, 2001 3:18:17 AM

Are you delirious? I think you need to read those numbers again, because you don't seem to understand that the Oxygen cards don't perform better.

Denial is a natural defence your brain uses to protect you from things that hurt, so it is OK that you said that, tonestar!