
why no motherboard support for 2-8 CPUs?

February 24, 2005 3:56:31 AM

i mean it would be pretty wicked if i could pop like 4
Pentium 4 2.4 GHz chips in one board
they are cheap and my system would be super fast, right?

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back


February 24, 2005 7:08:06 AM

That's why Intel and AMD invented the Xeon and Opteron.
February 24, 2005 7:24:22 AM

these boards exist, but they are anything but cheap :)

Furthermore, what (desktop) apps exactly do you think would benefit from 4 or 8 slow CPUs?

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 25, 2005 4:46:50 AM

Quote:
they are cheap

That's exactly the reason you can't do it. Intel makes the big bucks off server products; they don't want people building servers with cheap P4's. So they made 2 versions of every P4 core: the desktop version and the Xeon. The desktop version is intentionally handicapped to prevent it from being used in dual configurations.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
February 25, 2005 11:52:57 AM

Xeons @2.4 GHz aren't that much more expensive; they list on Pricewatch for $159. However, what really *is* expensive is designing and building motherboards that accommodate 4 or more CPUs, and the same for chipsets that can handle them. Maybe not so much for K8 chips (HTT), but definitely for traditional chips with high-speed FSBs. These boards are often 12+ layers.

Just want to show that the price difference between 2/4/8-way systems and single-CPU systems isn't purely marketing.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 25, 2005 3:47:20 PM

Doesn't Intel have different versions of the Xeon that are limited to 2 CPUs max and 4 CPUs max? With the version that supports quad config also supporting dual config, but not vice versa?

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
February 25, 2005 5:43:31 PM

In reply to:
Quote:
they are cheap

That's exactly the reason you can't do it. Intel makes the big bucks off server products, they don't want people building servers with cheap P4's. So they made 2 versions of every P4 core, the desktop version and the Xeon. The desktop version is intentionally handicapped to prevent it from being used in dual configurations.

someone should come out with a software/hardware fix to cheat intel

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 25, 2005 6:46:23 PM

arnold873, your idea is just nuts.

Even if you <i>could</i> make a P4 run like a Xeon, mobo manufacturers would be awfully hard pressed to design a board to support <b>eight</b> P4s that didn't cost a fortune.

<i>Even then</i>, however, one must still ask the question of how they would access memory. Would there be 8 memory banks? Would there be 4 shared memory banks? Would all eight share one memory bank and fight each other for bandwidth like little old ladies caning each other with their walkers for the last can of Metamucil?

<i>And even <b>then</b></i> how do you think that you would power this beast of a machine? You'd need either multiple PSUs or one hell of a big single PSU.

<i>And even <b>then</b></i> how would you <i>cool</i> this monstrosity of a box with <i>eight</i> CPUs?

Sorry, but your idea is just nuts. To <i>safely</i> and <i>effectively</i> do this you might as well cough up the money for an actual server like any educated person would do. Otherwise, even if you could get it to run (which I doubt) it would probably run like crap and still cost a fortune anyway.

Basically, you're better off buying a dualie AMD workstation than you are trying to build an eight P4 monstrosity.

<pre>Antec Sonata 2x120mm
P4C 2.6
Asus P4P800Dlx
2x512MB CorsairXMS3200C2
Leadtek A6600GT TDH
RAID1 2xHitachi 60GB
BENQ 16X DVD+/-RW
Altec Lansing 251
NEC FE990 19"CRT</pre><p>
February 25, 2005 7:49:37 PM

Check out Tyan's site for AMD. They do 2-, 4-, and 8-way systems. Well, there were rumors of 8-way systems anyway. Windows XP supports up to 2 processors, btw. Same with Windows 2000 (edit: someone correct me if i'm wrong here). So you'd have to get a copy of Windows 200x Server, with a version specific to the number of CPUs you wanted to use. Or go unix/linux; they have versions specific to the number of CPUs you'd like to use as well. We have a server where i work running 2003 Server with 4 CPUs in it, 2 physical + 2 HT. It was about an $8000 server. What's funny is that it doesn't do anything. But we needed to stay in standard with the other sites, so we got it. What a joke.

edit: for clarification... it doesn't do any work because there aren't any server/management programs running on it... and it isn't currently being used for file storage (for which it was mostly intended)

Current machines running F@H:
Athlons: [64 3500+][64 3000+][2500+][2000+][1.3x1][366]
Pentiums: [X 3.0][P4 2.4x5][P4 1.4]

It's not worth saying unless it takes a really long time to say! (Edited by apesoccer on 02/28/05 09:39 AM.)
February 25, 2005 11:21:07 PM

Yes, as does AMD and anyone else. Truth be said, Xeon MPs (>2-way) generally also have a lot more cache than uniprocessor or 2-way capable Pentiums/Xeons, which explains a bit of the pricing difference.

My point was just that it isn't because of such an opportunistic pricing strategy by Intel (AMD, ..) that 4/8-way systems are so expensive. In fact, the price of the CPUs is pretty small compared to the system as a whole, especially if you ignore the highest speed grades.

It's simply expensive to produce such motherboards, and it's difficult (hence expensive) to design the chipsets. Of course, low volume doesn't help either; so even with ~$150 Xeons, it makes no sense for a desktop user to get a 4-way machine, especially since the performance boost (for desktop apps) just won't be there.

Lastly: if "sockets" weren't so expensive, there would be not much point in making multicore CPUs, would there?

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 26, 2005 4:28:48 AM

Quote:
Chk out Tyan's site for AMD. They do 2, 4, and 8 way systems. Well, there were rumors of 8 way systems anyway. Windows XP supports up to 2 processors btw. Same with Windows 2000 (edit: someone correct me if i'm wrong here). So you'd have to get a copy of Windows 200x Server, with a version specific to the amount of cpu's you wanted to use. Or go unix/linux, and they have versions specific to the amt of cpu's you'd like to use as well. We have a server where i work running 2003 server with 4 cpus in it. 2 physical 2 ht. It was about an $8000 server. Whats funny, is that it doesn't do anything. But we needed to stay in standard with the other sites, so we got it. What a joke.

so i guess the extra CPUs did nothing for performance?
i was just looking for a good gaming pc that i didn't have to upgrade for like 2-3 years


hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 26, 2005 4:35:18 AM

Have you seen some of the new nForce4 boards? 2, 3, or 4 northbridges supporting extra PCIe lanes and so forth. It would seem like nearly any company could make a board with 4 single CPUs, 4 chipset northbridges, and 4 sets of RAM, so long as the chipset supported it.

The boards could become insanely expensive of course!

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
February 26, 2005 10:36:41 AM

You are correct on the OS issue. For 2 processors, you can't use XP Home; you need XP Pro (or Windows 2000 Pro). For 4 CPUs you'd need Windows Server 2003, and for 8, if I'm not mistaken, even the Enterprise version. However, I'm unsure how Windows will cope with multicore CPUs.

>so i guess the extra cpu`s did nothing for performance?
>i was just looking for a good gaming pc that i didn`t have
>to upgrade for like 2-3 years

For gaming, you're better off spending the money on the fastest CPU (only one) and a better videocard (or two in SLI). There is not a single game out there today that benefits from more than one CPU, and I do not expect this to change significantly during the lifetime of your machine.


= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 27, 2005 3:56:33 AM

the only reason i brought this up is i figured there could be a software or hardware solution that would make really fast and cheap computers for people that wanted them.

but i guess in order for all of it to work, several hurdles would have to be overcome. this seems to have not been too big a problem for video (SLI technology), so why would it be for the cpu? someone just needs to compile some software, and the hardware needs to be developed. i would think that the data could be compartmentalized the same way that you have a gpu and a sound processor; in other words, have the inexpensive cpus work on data between the gpu or from the hard drives, to take a load off the rest of the system. even though only a part of each cpu will be utilized, the sum of all these cpus will be great. as far as expense or complexity of the motherboard, i think plenty of people will buy it just to screw around with it to see what they could do. a pentium 3 266 goes for like what, $20, and would take what, 5% of the load the audigy puts on your cpu for sound processing. the same with your gpu; a 533 could be used to control the hdd, floppy, buses and ports etc., again taking some of the load off the main cpu.

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 27, 2005 4:29:10 AM

Quake 3 and Doom 3 support dual CPUs; I believe some others do as well.
The 2000/XP OS is multithreaded, as is DirectX, so dual CPUs help performance there, and dual CPUs also help background apps running.
Also, dual-CPU systems can have the OS running on one CPU while the game is running on a separate CPU, giving the game the full power of that processor. Usually doesn't add up to much, less than 5%.


<pre><font color=red>°¤o,¸¸¸,o¤°`°¤o \\// o¤°`°¤o,¸¸¸,o¤°
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign</pre><p></font color=red>
February 27, 2005 4:46:20 AM

what i'm saying is it should be forced in a hardware solution somehow. how can hundreds of computers work on a problem over the internet at the same time, and the information still makes sense?

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 27, 2005 3:00:29 PM

>but i guess in order for all of it to work several hurdles
>would have to be overcome, this seems to have not been to big
> a problem for video ( sli technology ) so why would it be
>for the cpu?

For 3D graphics, it's easy. Rendering is highly parallel by its very nature; it's not a problem to let one videocard render half the screen, and the other the other half. (In fact, this was done by 3DFX (Voodoo2) ages ago. Moreover, if you look at a modern GPU, it's extremely parallel itself.) The same goes for data storage (RAID is a similar concept).

Now, for game engines (and many other apps), it's either hard or sometimes impossible to spread the workload over different CPUs in an effective manner.

Consider this: say your workload is to mail 2000 letters. If that takes you one day (put them in envelopes, add stamp, write address..), 2 people will do it in half a day, and 12 people in an hour. This is 3D graphics.

Now consider you have to <i>write</i> the letter, and it takes you a day. How long will it take 12 people, do you think? Are you going to let everyone write a single paragraph? That's gonna be one weird letter...

In theory there should be some tasks in a game engine that can be threaded; AI, physics and 3D modelling come to mind. In practice, apparently there is so much need for communication between those threads that the benefit just ain't there. At least not yet.
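The letter analogy maps directly onto code. Here's a rough Python sketch (the function names are invented for illustration): stuffing envelopes is an independent `map` that any number of workers can share, while writing the letter is a chain where each step consumes the previous step's output, so it cannot be split up.

```python
from concurrent.futures import ThreadPoolExecutor

def stuff_envelope(letter_id):
    # Independent work: any worker can take any letter, in any order.
    return f"letter {letter_id} mailed"

def write_letter(paragraphs):
    # Dependent work: each paragraph builds on the text so far,
    # so the steps must run one after another on one "worker".
    text = ""
    for p in paragraphs:
        text = text + p + " "
    return text.strip()

# The parallel part scales with the number of workers...
with ThreadPoolExecutor(max_workers=12) as pool:
    mailed = list(pool.map(stuff_envelope, range(2000)))

# ...the serial part does not, no matter how many workers you have.
letter = write_letter(["Dear sir,", "here is my point.", "Regards."])
```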

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 27, 2005 3:04:58 PM

>what i`m saying is it should be forced in a hardware solution
> somehow. how can hundreds of computers at the same time work
> on a problem over the internet and the information makes
>sense?

Again, because the workload is highly parallel by its nature. Take SETI: of course every computer can calculate its own WU; there is no need to communicate between one computer and the other (besides aggregating the data again). Similarly, you could spread rendering a movie over several computers (each does specific frames), video editing, etc.; all linear workloads. Now try running an AI algorithm where one computer doesn't know what the other is doing.. I promise you some odd behaviour :)
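That SETI-style split can be sketched in a few lines of Python (hypothetical work units, purely for illustration): each chunk is processed with no knowledge of the others, and the only shared step is adding up the partial results at the end.

```python
def process_work_unit(chunk):
    # Each "computer" crunches its own chunk; no communication needed.
    return sum(x * x for x in chunk)

data = list(range(100))
# Split the problem into independent work units...
work_units = [data[i:i + 25] for i in range(0, len(data), 25)]
# ...which could run on 4 different machines, in any order.
partials = [process_work_unit(wu) for wu in work_units]
# The only coordination is aggregating the results at the end.
total = sum(partials)
```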

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 27, 2005 3:10:29 PM

>Quake 3 and Doom 3 support dual cpu's, I believe some others
>are also.

Yet AFAIR, Q3 actually ran *slower* in dual CPU mode than in single CPU mode (on a SMP machine).

>Also, dual cpu systems can have the os running on one cpu,
>while the game is running on a seperate cpu giving you the
>full power of that processor. Usually doesn't add up to
>much, less than 5%.

Well, that is the point, isn't it? If you are playing a game, it will require 95+% of your CPU time; the OS (or even DirectX) isn't doing anything significant by comparison.

Now if you are running multiple threads (or apps), like a CPU-intensive background app, of course more CPUs (or cores) help. Even HT helps there. But within a single game, it has yet to be shown there is any benefit from having multiple CPUs/cores.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 28, 2005 12:14:17 AM

so the AI could be done by one cpu,
the physics by another,
and 3d modeling by yet another.
so why would this not have a cumulative effect?

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 28, 2005 12:18:42 AM

again, what if cpus were set in parallel, as opposed to a task-oriented way, as i propose? the cpus could build on previous cpus' work? in other words, one cpu would ready the work area with pen, paper etc., the second would brainstorm, the third would do a rough draft, the fourth would do a draft, and the fifth would proofread and finalize.

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 28, 2005 12:16:30 PM

>one cpu would ready the work area with pen paper etc. the
>second would brainstorm, the third would do a rough draft,
>the forth would do a draft and the fifth would proofread and
>finalize.

Well, you have to do the brainstorming before the draft, and you can't proofread unless something is written. IOW, you're not winning any time! Also consider that inter-CPU communication is orders of magnitude slower than intra-CPU communication. So unless you can ask another cpu to do a sizeable, independent chunk of work, it would be faster to do it all on one cpu. To translate that to our example, those 12 employees would be spread around the globe with only ground mail as means of communication.

So, sure, one cpu could do AI, the other physics, but the AI decides for instance where to shoot, and you can't calculate the physics/ballistics until the AI has finished. And calculating the ballistics of that bullet is quite possibly done much faster than sending that data to another cpu and waiting for the result. Highly simplified example again, but you might get the point.
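A toy cost model makes the communication-latency point concrete. The numbers below are invented for illustration: even if a helper CPU did each step twice as fast, a dependent chain of steps loses badly once you charge for shipping the data back and forth.

```python
def chain_cost(steps, work_cost, comm_cost):
    # Each step needs the previous step's result, so costs just add up;
    # nothing can overlap.
    total = 0.0
    for _ in range(steps):
        total += comm_cost      # send the input to whoever does the work
        total += work_cost      # do the work
        total += comm_cost      # wait for the result to come back
    return total

# Doing it all locally: no communication at all.
local = chain_cost(100, work_cost=1.0, comm_cost=0.0)
# Offloading to a 2x-faster helper CPU, but paying 5 units each way.
offloaded = chain_cost(100, work_cost=0.5, comm_cost=5.0)
```

With these made-up numbers, the "faster" offloaded version takes over ten times as long as just doing the work on one CPU.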

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 28, 2005 1:56:27 PM

One more thing...

In SLI, i read somewhere that they said they were maxing out the CPU even with AMD 4000+s, instead of maxing out the GPUs first. In every system there are bottlenecks... and for the first time, the bottleneck is supposedly the CPU. What this could mean (i'm just throwing this out there... i don't know how true it is) is that in an SLI formation, where 2 different GPUs are taking care of data, a dual (core?) CPU might be more useful. If cpu0 is taking care of gpu0 and cpu1 is taking care of gpu1, we have the possibility of increasing our ultimate fps.

Now this idea is reliant on several things: one, that the drivers for the GPUs are designed to be able to send information to multiple CPUs (which isn't likely to happen in the near future, if at all); two, that we aren't maxing out our memory, ie, there isn't a bottleneck at the memory. Furthering that idea... if memory is a bottleneck, then could we, instead of using 2 CPUs on the same chip, go by way of the multi-Xeons and Opterons, with memory allocated to each CPU, and see a difference that way... ok, this has gotten long enough...

These of course are all questions that we can't answer, since most of us are limited by the amount of cash and/or time on our hands. But it would be interesting, for me anyway, to see these kinds of questions answered [go go gadget Tom, write an article].

Current machines running F@H:
Athlons: [64 3500+][64 3000+][2500+][2000+][1.3x1][366]
Pentiums: [X 3.0][P4 2.4x5][P4 1.4]

It's not worth saying unless it takes a really long time to say! (Edited by apesoccer on 02/28/05 10:20 AM.)
Anonymous
February 28, 2005 5:23:30 PM

That would make SLI more attractive; let's hope SOMEBODY does an article on it :smile:

<font color=green> Woohoo!! I am officially an <b> Enthusiast </b>!! </font color=green>
February 28, 2005 5:48:20 PM

>s that in a SLI formation, where 2 different gpu's are taking
> care of data, a dual (core?) cpu might be more useful.

Nope. A faster CPU would be useful, but 2 CPUs won't change a thing as long as the game engine cannot benefit from it. Furthermore, I'm not too intimate with today's SLI, but I assume the rendering is divided over the two cards (each doing a set of lines, for instance). Having each GPU being "served" by its own CPU is not going to work/help.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
Anonymous
February 28, 2005 6:55:58 PM

Well crap, you just rained all over that idea :smile: Bah, what do I care, I can't afford SLI anyway, lol

<font color=green> Woohoo!! I am officially an <b> Enthusiast </b>!! </font color=green>
February 28, 2005 8:28:28 PM

well, another analogy you can use is making a movie:
if the movie was shot in order, and if one process had to be finished before another was started, it would take years for a movie to get done. in relation to computers, i think several cpus could be utilized as long as one main cpu still existed. for example, a 733 mhz cpu could work with the gpu and do some prerendering, another cpu could work with the sound card, another could manage the buses, etc.
i'm not trying to say these cpus would do heavy-duty tasks,
but rather take some of the load off the cpu, the same way the gpu takes some of the load off the cpu. the cpu and gpu can co-exist, so why not a cpu and an AI scripting cpu,
or a physics cpu, or a hdd/optical managing cpu? in other words, these extra cpus would act as buffers to the cpu, complete with small amounts of memory (8 mb maybe?)
i mean, i could be completely off my rocker and not understand enough about computers, but this seems like a decent idea, and considering a 733 mhz pentium is only 20 dollars, it could be inexpensive if done right.

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 28, 2005 8:43:17 PM

Not giving up are you ?
:) 

>In relation to computers i think several cpu`s could be
>utilized as long as one main cpu still existed. for example
>a 733 mhz cpu could work with the gpu and do some
>prerendering another cpu could work with the sound card
>another can manage the buses and ect.,

Your computer is already full of special-purpose "processors" for things like audio (sound card, if you don't use AC97), video (GPU), networking (NIC), the memory controller, etc, etc. This is feasible if the workload is very different from what a CPU is good at, which is most obvious for video. For the other tasks, if there is any trend, it is to let the CPU handle them, as CPUs are dirt cheap for the processing power they offer.

Now, expanding your idea to dedicated "AI" processors or "physics" processors might in theory perhaps possibly be doable, if these chips are fast enough, cheap enough, and if the communication between these chips and the CPU isn't the bottleneck (which will be the real problem IMHO). But then consider how limited game developers would suddenly become in their creativity....

>considering a 733 mhz pentium is only 20 dollars it could be
> inexpensive if done right.

You do realize a P4 660 doesn't cost Intel anything more to produce, right?

= The views stated herein are my personal views, and not necessarily the views of my wife. =
February 28, 2005 10:51:25 PM

well, i'd like to think that faster progress could be made with computers without having to wait for the pigs at intel, amd, nvidia or ati milking us for every cent we have by releasing small steps in cpu/gpu progress; i mean, in what, 6 months the old stuff is practically obsolete. it's disgusting, and it doesn't have to be that way. look at the consoles for instance: they release a new set of consoles, and it takes what, 2 years for computers to catch up, and now why is that? it's because they make the most off the software. computer hardware developers do not have that advantage, so it has to be up to hackers and modders and such to develop this technology, and i think it can happen.

i don't think this tech would limit developers (who, i might add, write sloppy and extensive code) but give them extra tools. if, for instance, an AI cpu was utilized (which i have been wondering for the longest time why one hasn't been developed), the programmers could write more extensive and dedicated code; these scripts would not have to be shared by other processes.

i think what i'm suggesting is a total reworking of how computers operate and are developed, and a more user-defined direction.

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
February 28, 2005 10:54:35 PM

let me also just add that home computers were first developed by guys just like you and me,
so if it means taking a step back before you can take a leap ahead, it might be worth it, right?
let's take our computers back from these fascists

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back (Edited by arnold873 on 02/28/05 10:14 PM.)
March 1, 2005 12:28:55 PM

Quote:
for example a 733 mhz cpu could work with the gpu and do some prerendering another cpu could work with the sound card another can manage the buses and ect.,
i`m not trying to say the cpu`s would do heavy duty tasks
but rather take some of the load off the cpu the same way the gpu takes some of teh load of the cpu. the cpu and gpu can co-exist so why not a cpu and an AI scripting cpu
or a physics cpu or a hdd/optical managing cpu. anotherwords these extra cpu`s would act as buffers to the cpu complete with small amounts of memory (8 mb maybe?)

As was already said, this is what the other chips in other pieces of hardware are there for. The NIC has its own processor. The modem has its own processor. The sound card has its own processor. A RAID controller has its own processor. The northbridge and southbridge are their own processors. Each device generally has a task-specific processor built in that is tailored to its exact function, just like the GPU is for graphics.

So you're already talking out your <explitive never entered> because this already <i>is</i> how computers are built.

Quote:
i mean i could be completly off my rocker and not understand enough about computers but this seems like a decent idea and considering a 733 mhz pentium is only 20 dollars it could be inexpensive if done right.

Let me straighten this out for you. You <i>are</i> completely off of your rocker. You <i>don't</i> understand enough about computers.

You are failing to take into account one highly important factor: timing. Even if you can distribute the workload, you still have to ensure that the results of the distribution are re-assembled in the same order that they were programmed to be in. You can't write a file from memory to a CD when you haven't moved that file from the hard drive into memory yet. You can't render an image through the GPU onto the screen when you haven't loaded that image yet. Chronological order is <i>highly</i> important. And so long as the code is written to execute in a specific order, you have to preserve that order, even if it means wasting CPU cycles while waiting on another device.

All that you would ever accomplish with your idea is to shunt some of the processing from the main CPU onto secondary CPUs so that the main CPU can idle, waiting for the secondary CPUs to finish. Not only do you get nothing done any faster (in actuality it'd be even slower than using just one CPU) but it costs a fortune more.

arnold873, just give it up. You clearly don't have a clue. Everyone is trying so hard to nicely hand you that clue. But you're too stubborn to pay attention. So here it is bluntly. Give up this silly idea and go out and learn. You're not looking any smarter for blindly championing the futile while we all try to show you the error of your way.

<pre>Antec Sonata 2x120mm
P4C 2.6
Asus P4P800Dlx
2x512MB CorsairXMS3200C2
Leadtek A6600GT TDH
RAID1 2xHitachi 60GB
BENQ 16X DVD+/-RW
Altec Lansing 251
NEC FE990 19"CRT</pre><p>
March 1, 2005 12:50:21 PM

Quote:
well i`d like to think that faster progress could be made with computers without having to wait for the pigs at intel, amd, nvidia or ati having to milk us for every cent we have by releasing small steps in cpu/gpu progress;

Amazing.

Quote:
i mean in what 6 months the old stuff is practically obsolete its disgusting and it doesn`t have to be that way.

You're right. They should just stop releasing upgrades all together and keep us at the same processor for years. Is that what you want?

Quote:
look at the consoles for instance they release a new set of consoles and it takes what 2 years for computers to catch up and now why is that?

Wow. You're stunning, you know that don't you? I mean it wasn't like the XBox was an Intel CPU with nVidia graphics and sound crammed into a small form factor case that PCs matched and then exceeded the performance of in very short order. No. It was nothing like that at all. It took two years for computers to catch up. Yeah. Right.

Quote:
Its because they make the most off the software, computer hardware developers do not have that advantage so it as to be up to hackers and modders and such to develope this technology and i think it can happen.

Do you even realize how little sense you just made?

Quote:
i don`t think this tech would limit developers (who i might add write sloppy and extensive code)

So now you're saying that all software developers write sloppy code?

Quote:
but give them extra tools if for instance a AI cpu was utilized (which i have been wondering for the longest time why one hasn`t been developed)

Oh gee, I don't know, maybe there's no such thing as an AI CPU because:
1) No two AIs are alike.
2) AI is an emerging <i>growing</i> concept that would be extremely limited by standardization to a set API that would <i>always</i> be several steps behind what cutting-edge developers needed? (Remind anyone of OpenGL?)

Quote:
i think what i`m suggesting is a total reworking of how computers operate and are developed and a more user defined direction.

And <b>I</b> <i>think</i> that what you're suggesting is <i>complete idiocy</i>. But then opinions are like arseholes. We all have our own, yada yada.

Quote:
let me also just add that home computers were first developed by guys just like you and i
so if it means taking a step back before you can take a leap ahead it might be worth it right?

You do that. Go away and develop your own computer. Take matters into your own hands. Are you going to whine and moan like a sissy, or are you going to <i>do</i> something about it? Go do something.

Quote:
lets take our computers back from these fascists

Do you even know what the word 'fascist' means? Are you seven years old or something?

<font color=red><b>Frankly, you've amazed me. I do honestly believe that you are <i>THE</i> most immature poster yet to soil these boards.</b></font color=red> You should get an award.

<pre>Antec Sonata 2x120mm
P4C 2.6
Asus P4P800Dlx
2x512MB CorsairXMS3200C2
Leadtek A6600GT TDH
RAID1 2xHitachi 60GB
BENQ 16X DVD+/-RW
Altec Lansing 251
NEC FE990 19"CRT</pre><p>
Anonymous
March 1, 2005 2:27:34 PM

Wow, talk about going off.
Quote:
But then opinions are like arseholes. We all have our own, yada yada.

Nice Dirty Harry quote btw :smile:

<font color=green> Woohoo!! I am officially an <b> Enthusiast </b>!! </font color=green>
<i> <font color=red> One new Firefox fan </font color=red> </i>
March 1, 2005 3:03:32 PM

So you're pretty much saying that that idea is screwed, since most games (engines) aren't multithreaded... =/

I can accept that...if i have to.
"i am a man...<belch>...and i can change...if i have to"

Current machines running F@H:
Athlons: [64 3500+][64 3000+][2500+][2000+][1.3x1][366]
Pentiums: [X 3.0][P4 2.4x5][P4 1.4]

It's not worth saying unless it takes a really long time to say!
March 1, 2005 3:42:49 PM

first, i have to say that people thought howard hughes and his like were crazy. second, you have no imagination.

you don't think intel and amd are milking computer users? i mean, come on, they have roadmaps of slight improvements until 2010. HT? what a bunch of crap people paid extra for.

and it did take quite a long time for computers to catch up to the consoles, especially considering an xbox only had a 733 in it and pcs had much more.

software developers do write very sloppy code; look at windows xp for a great example. one of the best operating systems ever (amiga dos) ran on a 7mhz system, took no time to load, and almost never crashed. now it takes up to a full minute to load an OS onto systems hundreds of times faster.

Quote:

Oh gee, I don't know, maybe there's no such thing as an AI CPU because:
1) No two AIs are alike.
2) AI is an emerging growing concept that would be extremely limited by standardization to a set API that would always be several steps behind what cutting-edge developers needed? (Remind anyone of OpenGL?)


YOU sound like an idiot. first, the cpu can be programmed to do more than one thing;
second, what, you're going to wait till it's all standardized to do something about it? something that could never have a standard, mind you.


and lastly, i have to say most people have been very nice on these boards, but you are a loser. there is no need to poke personal remarks at me. i'm trying to have a discussion here,
and i realize that there are shortfalls in my ideas and theories, but there is no need for you to act superior, because you are not. and the question of fascists? you need to go back to school and learn what a fascist is. or perhaps you already know, because you are acting like one.

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
Edited by arnold873 on 03/01/05 12:43 PM.
March 1, 2005 3:56:10 PM

Quote:
again if cpu`s were set in parallel as opposed to a task oriented way as i propose, the cpu`s could build on previous cpu`s work? in other words one cpu would ready the work area with pen paper etc. the second would brainstorm, the third would do a rough draft, the fourth would do a draft and the fifth would proofread and finalize.

The <A HREF="http://en.wikipedia.org/wiki/Von_Neumann_architecture" target="_new">von Neumann</A> architecture of modern computers simply makes the parallelization of the described task impossible. In fact, you have also just outlined a 'von Neumann' type process where one thing has to happen after another, i.e. the output of one piece of code has to be input into the next etc. It's such a pervasive way of thinking that it's extremely hard to escape; logical thought processes seem to proceed in series. However, there is no doubt that our brains certainly don't process information in series. The brain 'runs' at about 8Hz yet processes inordinate amounts of information. It can do this because it is parallel. The challenge for future computers will be to find a way to create code that will truly work in parallel without creating enormous inefficiencies.

Computer programs already do this to some extent through <A HREF="http://en.wikipedia.org/wiki/Instruction_level_parallel..." target="_new">Instruction Level Parallelism</A>, but it seems as though ILP has gone just about as far as it can, even with HT, hence the computer developers' switch to dual core CPUs. And there is no reason to think that it will stop there; the future development of a massively parallel computer architecture seems inevitable.
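The serial-dependency point above can be sketched in a few lines of Python (a toy illustration; the stage names are made up for the example, not anyone's real code):

```python
from concurrent.futures import ThreadPoolExecutor

def brainstorm(topic):          # stage 1
    return topic + ": ideas"

def rough_draft(ideas):         # stage 2 needs stage 1's output
    return ideas + " -> draft"

def proofread(draft):           # stage 3 needs stage 2's output
    return draft + " -> final"

# The "writing an essay" pipeline: each stage consumes the previous
# stage's output, so the stages cannot run at the same time no matter
# how many CPUs you have -- a von Neumann style dependency chain.
def serial_pipeline(topic):
    return proofread(rough_draft(brainstorm(topic)))

# By contrast, *independent* tasks (many essays at once) parallelize
# trivially: each worker runs the whole chain on its own input.
def parallel_batch(topics):
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(serial_pipeline, topics))
```

The point being: extra workers help only when there are independent inputs to hand out, not within one dependency chain.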
March 1, 2005 4:01:22 PM

Quote:

As was already said, this is what the other chips in other pieces of hardware are there for. The NIC has its own processor. The modem has its own processor. The sound card has its own processor. A RAID controller has its own processor. The northbridge and southbridge are their own processors. Each device generally has a task-specific processor built in that is tailored to its exact function, just like the GPU is for graphics.

so let me get this right. the cpu is unable to do what these task specific chips do? WELL hello if cpu`s did them they would be much more changeable for future upgrades, in other words they can be changed in software as the need or process changed. The whole idea of computers is that they are not necessarily task specific.

Quote:

You are failing to take into account one highly important factor: timing. Even if you can distribute the workload, you still have to ensure that the results of the distribution are re-assembled in the same order that they were programmed to be in. You can't write a file from memory to a CD when you haven't moved that file from the hard drive into memory yet. You can't render an image through the GPU onto the screen when you haven't loaded that image yet. Chronological order is highly important. And so long as the code is written to execute in a specific order, you have to preserve that order, even if it means wasting CPU cycles while waiting on another device.

so you`re trying to say that the northbridge would be faster than a cpu
use your head. drives can be made to do almost anything and i`m sure they can do something with the code.
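For what it's worth, the quoted ordering constraint is easy to sketch in Python (the function names are hypothetical stand-ins, not a real disk or CD-burning API):

```python
from concurrent.futures import ThreadPoolExecutor

def read_from_disk(name):
    # Pretend slow I/O: load the file into memory.
    return f"<contents of {name}>"

def burn_to_cd(contents):
    # Pretend CD write: needs the contents already in memory.
    return f"burned: {contents}"

def copy_to_cd(name):
    with ThreadPoolExecutor(max_workers=2) as pool:
        reading = pool.submit(read_from_disk, name)
        # Even with a second worker sitting free, the burn step must
        # wait on the read's result; .result() blocks until it is
        # available, so chronological order is preserved and the
        # extra worker buys nothing for this particular chain.
        burning = pool.submit(burn_to_cd, reading.result())
        return burning.result()
```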

Quote:

All that you would ever accomplish with your idea is to shunt some of the processing from the main CPU onto secondary CPUs so that the main CPU can idle, waiting for the secondary CPUs to finish. Not only do you get nothing done any faster (in actuality it'd be even slower than using just one CPU) but it costs a fortune more.

well what the hell is dual core processors then?
i guess they won`t work because they will be idling the whole time and the code is not specific to them. you know you have no insight or vision, get another hobby. and as far as people trying to give me a clue, some others have suggested that yes, more than one cpu is possible and could be beneficial, so get off your high horse.

Quote:

In SLI, i read somewhere that they said they were maxing out the CPU even with amd 4000s, instead of maxing out the gpus first. In every system there are bottlenecks...and for the first time, the bottleneck is supposedly the cpu. What this could mean (i'm just throwing this out there...i don't know how true it is), is that in a SLI formation, where 2 different gpu's are taking care of data, a dual (core?) cpu might be more useful. If cpu0 is taking care of gpu0 and cpu1 is taking care of gpu1, we have the possibility of increasing our ultimate fps. Now this idea is reliant on several things...one, that the drivers for the gpus are designed to be able to send information to multiple cpus (which isn't likely to happen in the near future, if at all)...two, that we aren't maxing out our memory; ie, there isn't a bottleneck at the memory. Furthering that idea...if memory is a bottleneck, then could we, instead of using 2 cpus on the same chip, go by way of the multi xeon's and opterons, with memory allocated to each cpu, and see a difference that way... ok this has gotten long enough...


apesoccar wrote that
so try talking to him the way you have to me and see how far it gets you clown

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
Edited by arnold873 on 03/01/05 01:08 PM.
March 1, 2005 4:40:58 PM

Quote:
first i have to say that people thought howard huges and his like were crazy.

There's a difference between being imaginative and being stupid. Imaginative people come up with exciting new concepts. Stupid people continue to rant about how great something could be even after several people point out that it's already been done that way, and in an even better method.

Quote:
you don`t think intel and amd are milking computers users?

There's a difference between running a business to make money and "milking" customers. Do you honestly believe that Intel and AMD could just mass-produce CPUs ten times as fast at the drop of a hat? And even if they could now, do you <i>really</i> think that they could continue that rate of development? Do you have any concept of platform plateaus? Can you even say a single thing that proves you understand even a fraction of what you're babbling about?

Quote:
and it did take quite a long time for computers to catch up to the consoles, especially consedering a xbox only had a 733 in it and pcs had much more.

Out of curiosity, what world do you live in? In a rather short time nVidia released an AGP card equivalent to the graphics that were in the XBox, and everything else in the PC world was identical or better. You have no proof for your deranged allegations. The reason for that is, by the way, because your allegations are contrary to reality.

Quote:
software developers so write very sloppy code look at windows xp for a great example. one of the best operating systems ever (amiga dos) ran on a 7mhz system and took no time to load and almost never crashed, now it takes up to a full minute to load an OS onto systems hundreds of times faster.

1) Generalizing <i>all</i> software developers based on Microsoft is like generalizing all police based on the Keystone Cops.
2) Have you even heard of Linux? How about Unix, BSD, BeOS, PalmOS, etc. ad nauseam? I suppose next you're going to tell me that they're all sloppy code as well?
3) Have you ever thought to consider how much more WinXP does? I'm not saying that WinXP isn't bloated, but geeze, you're comparing an ancient minimally-functional OS to what could quite possibly be the most feature-rich OS written yet. Features take resources.

Quote:
YOU sound like an idiot. first, the cpu can be programmed to do more than one thing
second, what, you`re going to wait till it`s all standardized to do something about it, something that could never have a standard mind you.

1) Look at your sentence structure. Now look at mine. One of us has actually learned grade-school level English grammar or higher. One of us can write intelligibly. If not in actual truth, <i>one</i> of us at least <i>sounds</i> intelligent, while the other <i>sounds</i> like a blithering moron.
2) If you make a processor tailored specifically to running an AI, then pray tell how can it do more than just one thing?
3) In theory someone could standardize an AI API. In reality, it'd serve little purpose as it would constantly be lagging behind what would be required by cutting-edge developers. So it <i>could</i> have a standard, but that standard would only serve to detract from the concept's advancement at this point. <i>Which</i> is why leaving the coding up to the programmers and running it on a general purpose CPU is a much more logical and useful solution.

Quote:
and lastly i have to say most people have been very nice on these boards but you are a loser.

Do you honestly believe that I care what you think about me? Sticks and stones, my friend.

Quote:
there is no need to poke personal remarks at me. I`m trying to have a discussion here
and i realize that there are shortfalls in my ideas and theories but there is no need for you to act superior because you are not.

Actually, I thought it quite obvious that in the realms of programming and computers (as well as English grammar), compared to you I <i>am</i> quite superior. This is not to say that I am abnormally gifted compared to the average person on these boards, but simply that you are that far below average (and thusly below myself) when it comes to knowledge of what you have been talking about. This makes you no less of a person in general of course, but simply makes me superior at this present point in time when it comes to these subjects. Such is life. If you are that incapable of accepting your own inferiorities then you have a very rough road ahead of you.

Further, perhaps if you would start listening to people and showing any ability to learn whatsoever instead of, well, being <i>you</i>, then perhaps I would have no need of getting your attention by making personal remarks. <i>Had</i> you listened to everyone being nice to you, I would never have bothered. <i>That</i> I have bothered is actually indication that I see some potential in you, but as you are showing, potential and actuality are two very different things.

Yet further than that, <i>if</i> such mild personal remarks from someone that you've never met in your life (and thusly should have no bearing upon you) have offended you that badly, then perhaps you should be evaluating your self-esteem.

And to finish off, a <i>discussion</i> is a mutually beneficial <i>exchange</i> of ideas. So far all that you've done is repeated your <i>wrong</i> 'facts' and concepts while everyone else tries to kindly correct you. <i>If</i> you were interested in a <i>discussion</i> then you would have actually <i>listened</i> to them (and me) by now and have <i>learned</i> something. However, since we are where we are, it remains rather clear that you have in fact <i>not</i> been <i>discussing</i> anything. Repetition of one's self in spite of overwhelming contrary evidence is <i>not</i> a discussion.

Quote:
and the question of fascists? you need to go back to school and learn what a fascist is, or perhaps you already do because you are acting like one.

<i>I</i> know. You, clearly, as expressed by your own words, do not. I would <i>highly</i> suggest that on a message board known to be frequented by Germans that you not throw such terms out so carelessly. Ignorance is no excuse.

And finally, I must ask, what <b>is</b> your age?

<pre>Antec Sonata 2x120mm
P4C 2.6
Asus P4P800Dlx
2x512MB CorsairXMS3200C2
Leadtek A6600GT TDH
RAID1 2xHitachi 60GB
BENQ 16X DVD+/-RW
Altec Lansing 251
NEC FE990 19"CRT</pre><p>
March 1, 2005 5:16:53 PM

Quote:
so let me get this right. the cpu is unable to do what these task specific chips do? WELL hello if cpu`s did them they would be much more changeable for future upgrades, in other words they can be changed in software as the need or process changed. The whole idea of computers is that they are not necessarily task specific.

If you plan on blathering on about something that you are asking if you have right, it would be wise to at least first <i>have it right</i>.

I never said that a CPU was incapable. In fact, quite the contrary. Many miniaturized components such as PCMCIA cards, as well as many <i>cheap</i> components, use drivers that force the CPU to do all of the work instead of mounting task-specific processors in their hardware. What I <i>said</i> is that computers are already designed with a plethora of task-specific processors to minimize the usage of the general-processing CPU so that the CPU can be used more efficiently. And this is why your idea of a CPU for this and a CPU for that is moot, because the computer is in fact already designed just so, and in a much better method than you propose.

Quote:
so you`re trying to say that the northbridge would be faster than a cpu
use your head. drives can be made to do almost anything and i`m sure they can do something with the code.

Again, try listening for a change. I am not saying that they are necessarily faster, but that they are at least <i>as</i> fast, thus enabling that workload to be spent somewhere else than in the CPU. But on the subject of speed and using the northbridge as an example, yes, a northbridge is faster than a CPU when it comes to memory address handling and the memory controller is mounted on the northbridge instead of in the CPU's die. (A current architectural difference between Intel and AMD.) Another example is that a video card's GPU is considerably faster at processing 3D graphics commands than CPU emulation is.

Further, why is it that you are "sure they can do something with the code" to run things faster in a general-purpose CPU and yet at the same time alleging that all software programmers are sloppy? Don't you think that is a contradictory point of view?

Quote:
well what the hell is dual core processeors than?
i guess they won`t work because they will be ideling the whole time and the code is not specific to them.

Earth to arnold873... Have you even done a search at all for people's conversations about dualie boxes? That is exactly what the problem with them is. <i>Most</i> applications are written single-threaded. As such they <i>won't</i> utilize the resources of the second CPU. Do a little research for a change and then come back when you have a clue. You'll sound much less stupid that way.
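The single-threaded point can be sketched in Python (a toy example; in CPython the GIL means the threads show the *structure* of the split, not a real speedup, and the checksum just stands in for any divisible workload):

```python
from concurrent.futures import ThreadPoolExecutor

def checksum(data):
    # Single-threaded version: one flow of control, so on a dual-CPU
    # (or dual-core) box the second processor simply sits idle.
    return sum(data)

def checksum_mt(data, workers=2):
    # The same work explicitly split into independent chunks, one per
    # worker. Only code written this way can occupy a second CPU;
    # nothing about the hardware makes a single-threaded program use it.
    size = len(data)
    chunks = [data[i * size // workers:(i + 1) * size // workers]
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, chunks))
```

Both produce the same answer; the difference is that the second version has work the OS *could* schedule on another processor, which is exactly what most 2005-era applications lacked.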

Quote:
you know you have no insight or vision get another hobby.

Actually, I have plenty of hobbies. Besides computer programming, building my own PCs, and web page development I also read and write a lot of fantasy\sci-fi novels. During summer I also swim a lot. And then there's of course my wife, which one could call the most expensive hobby of all. However, one of my most favorite and dirtiest pleasures is picking on idiots. ;) 

Quote:
and as far as people trying to give me a clue, some others have suggested that yes, more than one cpu is possible and could be beneficial

Try paying attention for a change. Re-read the posts. No one is agreeing with you. They're all pointing out reasons why you're wrong.

Quote:
so get off your high horse.

But the view from here is so entertaining... :-p

Quote:
apesoccar wrote that
so try talking to him the way you have to me and see how far it gets you clown

Firstly, someone else has already corrected apesoccer.

Secondly, apesoccer has proven to be far more capable of a <i>discussion</i> than you. I have no reason to accost someone for actually paying attention. Had you shown the same capability for learning we wouldn't be having this conversation.

Thirdly, I'll have you know that I'm a jester, not a clown. My humor is more intellectual than slapstick and I primarily cater to a more noble class. In your case however, I'll make an exception. :-p Get over yourself already. Life's too short to be so serious. You'll give yourself a heart attack by the time you're twenty at the rate you're going.

<pre>Antec Sonata 2x120mm
P4C 2.6
Asus P4P800Dlx
2x512MB CorsairXMS3200C2
Leadtek A6600GT TDH
RAID1 2xHitachi 60GB
BENQ 16X DVD+/-RW
Altec Lansing 251
NEC FE990 19"CRT</pre><p>
Anonymous
a b à CPUs
March 1, 2005 5:36:14 PM

Wow. That was without a doubt <b> THE </b> longest flame I have read on this board so far. Granted I am new here... I guess Arnie hit a nerve or something, you just went off on him like a virgin on prom night. I'm a third party to this whole line of posts, and even <b> I'm </b> stunned. Oh, and btw, Arnie, I'm so proud of all the quoting you did in your last post, keep that up man :smile:

<font color=green> Woohoo!! I am officially an <b> Enthusiast </b>!! </font color=green>
<i> <font color=red> One new Firefox fan </font color=red> </i>
March 1, 2005 5:55:07 PM

Quote:
Wow. That was without a doubt THE longest flame I have read on this board so far. Granted I am new here...

I aim to entertain, elucidate, and educate. Should this also incur the opportunity to castigate, then all the better. :) 

Seriously though, that was nothing. I've been exceedingly mild. You should have been here a few years back when this place was really flaming. :o 

Quote:
I guess Arnie hit a nerve or something

Actually, I've just got a lot of free time at the moment because my code is taking an exceptional amount of time to compile. (Something I rarely have to do when I primarily work in Python.) Well, that and I'm having trouble coaxing the Python profiler into giving me a clue why our XML parser is taking 2.5 hours to write a single XML file that should only take a minute to write. :\ I had hoped that it would be easier to uncover than this.

Or, in other words, I'm bored. :o 

But I will point out that my flaming is very light. You'll note that the vast majority of it is observation and implication and not something so cheap and tawdry as simple name calling. After all, only rank amateurs really need to rely upon something so base. Professional entertainers strive for so much more. ;) 

Do a little dance, sing a little song...

<pre>Antec Sonata 2x120mm
P4C 2.6
Asus P4P800Dlx
2x512MB CorsairXMS3200C2
Leadtek A6600GT TDH
RAID1 2xHitachi 60GB
BENQ 16X DVD+/-RW
Altec Lansing 251
NEC FE990 19"CRT</pre><p>
Anonymous
a b à CPUs
March 1, 2005 6:17:40 PM

well if I ever need somebody to "observe and implicate" someone, I'll know who to call... :smile:

<font color=green> Woohoo!! I am officially an <b> Enthusiast </b>!! </font color=green>
<i> <font color=red> One new Firefox fan </font color=red> </i>
March 1, 2005 7:18:16 PM

*puts a marshmallow on a stick and holds it next to arnold*

Good judgement comes from experience.
Experience comes from bad judgement.
March 1, 2005 7:18:16 PM

Quote:
so let me get this right. the cpu is unable to do what these task specific chips do? WELL hello if cpu`s did them they would be much more changeable for future upgrades, in other words they can be changed in software as the need or process changed. The whole idea of computers is that they are not necessarily task specific.

If you plan on blathering on about something that you are asking if you have right, it would be wise to at least first have it right.

I never said that a CPU was incapable. In fact, quite the contrary. Many miniaturized components such as PCMCIA cards, as well as many cheap components, use drivers that force the CPU to do all of the work instead of mounting task-specific processors in their hardware. What I said is that computers are already designed with a plethora of task-specific processors to minimize the usage of the general-processing CPU so that the CPU can be used more efficiently. And this is why your idea of a CPU for this and a CPU for that is moot, because the computer is in fact already designed just so, and in a much better method than you propose.

I DON`T THINK SO


Quote:
so you`re trying to say that the northbridge would be faster than a cpu
use your head. drives can be made to do almost anything and i`m sure they can do something with the code.

Again, try listening for a change. I am not saying that they are necessarily faster, but that they are at least as fast, thus enabling that workload to be spent somewhere else than in the CPU. But on the subject of speed and using the northbridge as an example, yes, a northbridge is faster than a CPU when it comes to memory address handling and the memory controller is mounted on the northbridge instead of in the CPU's die. (A current architectural difference between Intel and AMD.) Another example is that a video card's GPU is considerably faster at processing 3D graphics commands than CPU emulation is.

SO WOULDN`T MORE CPUS DO IT QUICKER

Further, why is it that you are "sure they can do something with the code" to run things faster in a general-purpose CPU and yet at the same time alleging that all software programmers are sloppy? Don't you think that is a contradictory point of view?

NOT ALL BUT MOST. SINCE CPU`S ARE QUICKER THEY HAVE LESS REASON TO BE FRUGAL. THAT`S TO SAY I DON`T SEE A MARKED DIFFERENCE IN WINDOWS XP BUT IT SURE DOES TAKE MORE RESOURCES. THE SAME WITH A GAME LIKE HALF-LIFE, IT TAKES 10 TIMES MORE COMPUTING POWER BUT I CERTAINLY DON`T THINK THE GRAPHICS ARE 10 TIMES BETTER

Quote:
well what the hell is dual core processors then?
i guess they won`t work because they will be idling the whole time and the code is not specific to them.

Earth to arnold873... Have you even done a search at all for people's conversations about dualie boxes? That is exactly what the problem with them is. Most applications are written single-threaded. As such they won't utilize the resources of the second CPU. Do a little research for a change and then come back when you have a clue. You'll sound much less stupid that way.

FIRST OF ALL I KNOW THAT JERK-OFF, AND SECOND THE APPLICATIONS WILL HAVE TO BE THERE AT SOME POINT.
IF ONE BUILT A PC WITH WHAT I CALL "CAIC", CUMULATIVE ARRAY OF INEXPENSIVE CPU`S, THE SOFTWARE WILL BE THERE AT SOME POINT. INTEL AND AMD ARE BOTH MAKING DUAL AND QUAD CORE CPUS SO A PC WITH 2 OR FOUR REGULAR CORE CPUS WOULD IN FACT OPERATE THE SAME WAY AND BE CHEAPER AND FASTER. THE DUAL CORE PENTIUMS ARE GOING TO BE NOT ONLY CLOCKED DOWN ANYWAY BUT ALSO MORE EXPENSIVE. OLD TECH WILL BE MUCH CHEAPER AND FASTER. HOW MUCH WILL 2 PENTIUM 2.4 GHZ COST, CERTAINLY LESS THAN A DUAL CORE PENTIUM CLOCKED THE SAME


Quote:
you know you have no insight or vision get another hobby.

Actually, I have plenty of hobbies. Besides computer programming, building my own PCs, and web page development I also read and write a lot of fantasy\sci-fi novels. During summer I also swim a lot. And then there's of course my wife, which one could call the most expensive hobby of all. However, one of my most favorite and dirtiest pleasures is picking on idiots. ;) 

YOU ARE STILL A LOSER AND I`M SURE YOUR WIFE HATES YOU BECAUSE YOU ARE INSECURE WITH YOURSELF FOR HAVING TO PICK ON PEOPLE YOU PERCEIVE AS BEING LESS THAN YOURSELF

Quote:
and as far as people trying to give me a clue, some others have suggested that yes, more than one cpu is possible and could be beneficial

Try paying attention for a change. Re-read the posts. No one is agreeing with you. They're all pointing out reasons why you're wrong.

NO, YOU NEED TO READ. THERE ARE SEVERAL PEOPLE WHO QUESTION IT
AND FOR THE REASONS I POINT OUT ABOVE:
DUAL CORE IDIOT!!!!!!!!!!


Quote:
so get off your high horse.

But the view from here is so entertaining... :-p

UNDER MY SHOE WOULD BE BETTER


Quote:
apesoccar wrote that
so try talking to him the way you have to me and see how far it gets you clown

Firstly, someone else has already corrected apesoccer.

SO HE STILL THOUGHT IT WAS POSSIBLE, SOMETHING YOU SAID NO ONE ELSE DID

Secondly, apesoccer has proven to be far more capable of a discussion than you. I have no reason to accost someone for actually paying attention. Had you shown the same capability for learning we wouldn't be having this conversation.

I HAVE 3 DEGREES MY CAPABILITY FAR EXCEEDS YOURS I AM SURE

Thirdly, I'll have you know that I'm a jester, not a clown. My humor is more intellectual than slapstick and I primarily cater to a more noble class. In your case however, I'll make an exception. :-p Get over yourself already. Life's too short to be so serious. You'll give yourself a heart attack by the time you're twenty at the rate you're going.

SO STOP SCREWING WITH ME THEN
I DID NOT BOTHER YOU
I HAVE ONLY THROWN OUT A QUESTION
AND WANTED TO DISCUSS IT
I DON`T NEED SOME ASS CLOWN (CORRECTED)
BOTHERING ME


hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
March 1, 2005 7:56:20 PM

Quote:

There's a difference between being imaginative and being stupid. Imaginative people come up with exciting new concepts. Stupid people continue to rant about how great something could be even after several people point out that it's already been done that way, and in an even better method.

I THOUGHT YOU SAID IT COULDN`T BE DONE THAT WAY
HELLO
AND LEARN WHAT THE WORD CREATIVE MEANS
PEOPLE SAID THE SPRUCE GOOSE WOULDN`T FLY AND IT DID
SO MAYBE HOWIE SHOULD HAVE LISTENED TO ALL THESE LOSERS THAT SAID "HEY IT CAN`T BE DONE THAT WAY"
BAKA
Quote:

There's a difference between running a business to make money and "milking" customers. Do you honestly believe that Intel and AMD could just mass-produce CPUs ten times as fast at the drop of a hat? And even if they could now, do you really think that they could continue that rate of development? Do you have any concept of platform plateaus? Can you even say a single thing that proves you understand even a fraction of what you're babbling about?


WHAT, DO YOU WORK FOR THESE GUYS? WELL, LETS SAY WHEN ATI PUTS OUT ITS VIDEO CARDS THEY HAVE DIFFERENT LEVELS (DEPENDING ON PRICE POINTS) BUT THEY PUT THE SAME CHIP IN ALL THE CARDS AND ONLY ALLOW ACCESS TO PART OF THE PIPES DEPENDING ON WHICH LEVEL YOU BOUGHT. I THINK THAT`S AN ABUSE. WHEN INTEL STARTS CHARGING FOR HT TECHNOLOGY EVEN THOUGH IT WAS IN A LOT OF THE PRIOR CHIPS BUT NOT TURNED ON, THAT`S AN ABUSE

Quote:

Out of curiosity, what world do you live in? In a rather short time nVidia released an AGP card equivalent to the graphics that were in the XBox, and everything else in the PC world was identical or better. You have no proof for your deranged allegations. The reason for that is, by the way, because your allegations are contrary to reality.



WELL HERE`S A FACT: THE XBOX COSTS 150 DOLLARS, RIGHT?
AND HOW MUCH DID THE NVIDIA VIDEO CARD THAT WAS COMPARABLE COST?
I BELIEVE IT WAS THE NV40, WHICH DIDN`T COME OUT UNTIL MAYBE 1.5-2 YEARS LATER

Quote:

1) Generalizing all software developers based on Microsoft is like generalizing all police based on the Keystone Cops.
2) Have you even heard of Linux? How about Unix, BSD, BeOS, PalmOS, etc. ad nausium? I suppose next you're going to tell me that they're all sloppy code as well?
3) Have you ever thought to consider how much more WinXP does? I'm not saying that WinXP isn't bloated, but geeze, you're comparing an ancient minimally-functional OS to what could quite possibly be the most feature-rich OS written yet. Features take resources.


I DON`T CARE. YOU MUST BE ONE OF THOSE LAZY PROGRAMMERS

Quote:

1) Look at your sentence structure. Now look at mine. One of us has actually learned grade-school level English grammar or higher. One of us can write intelligibly. If not in actual truth, one of us at least sounds intelligent, while the other sounds like a blithering moron.
2) If you make a processor tailored specifically to running an AI, then pray tell how can it do more than just one thing?
3) In theory someone could standardize an AI API. In reality, it'd serve little purpose as it would constantly be lagging behind what would be required by cutting-edge developers. So it could have a standard, but that standard would only serve to detract from the concept's advancement at this point. Which is why leaving the coding up to the programmers and running it on a general purpose CPU is a much more logical and useful solution.

I`M NOT WRITING A BOOK, I`M ANSWERING AN IDIOT, AND I DO NOT CARE ABOUT SENTENCE STRUCTURE. ANY QUESTIONS?
AND THE CPU FOR THE AI WOULD BE GENERAL, THAT`S WHY I`M PROPOSING RUNNING IT ON A CPU, DUH!

Quote:

And to finish off, a discussion is a mutually beneficial exchange of ideas. So far all that you've done is repeated your wrong 'facts' and concepts while everyone else tries to kindly correct you. If you were interested in a discussion then you would have actually listened to them (and me) by now and have learned something. However, since we are where we are, it remains rather clear that you have in fact not been discussing anything. Repetition of one's self in spite of overwhelming contrary evidence is not a discussion.


HAD YOU THE ACTUAL ABILITY TO READ AND THINK YOU MIGHT HAVE SEEN THAT I HAVE CHANGED MY VIEWS ON MY ORIGINAL IDEA SEVERAL TIMES AND AS FAR AS THE SUPERIOR THING GOES, BLOW ME.

Quote:

I know. You, clearly, as expressed by your own words, do not. I would highly suggest that on a message board known to be frequented by Germans that you not throw such terms out so carelessly. Ignorance is no excuse.


FIRST OF ALL I AM GERMAN. SECOND, NOT ONLY GERMANS WERE FASCISTS, THE ITALIANS WERE TOO, SO YOU FORGOT THEM.
AND LASTLY, WHAT DOES THE FACT THAT I BRING UP FASCISM HAVE TO DO WITH BEING CARELESS? THE FACT THAT GERMANY WAS A FASCIST STATE DURING WWII IS A FACT AND I DID NOT BRING IT UP TO OFFEND GERMANS, OR ITALIANS FOR THAT MATTER. I BROUGHT IT UP TO ILLUSTRATE YOUR INABILITY TO SEE WHAT IS GOING TO BE THE FUTURE. YOU ARE ONE SIGHTED AND SEEK TO PRESERVE YOUR PERCEIVED SUPERIORITY BY PUTTING OTHERS DOWN AND TRYING TO SUPPRESS THEM, JUST LIKE THE FASCISTS. AS A MATTER OF FACT YOU`RE WORSE THAN THE FASCISTS, YOU`RE LIKE STALIN. OK COMRADE

MY AGE IS OF NO IMPORTANCE TO YOU
JUST KEEP IN MIND I`M OLD ENOUGH TO SPANK YOU LIKE ONE OF MY CHILDREN

hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
Edited by arnold873 on 03/01/05 05:00 PM.
March 1, 2005 9:47:01 PM

Arnold,

Don't get too worked up by SLVR, he likes to hear himself talk, thinks he's pretty clever. But, in this case, he is right...

>WELL HERES A FACT; THE XBOX COSTS 150 DOLLARS, RIGHT?

It may cost you $150, but it costs MS more. They make money on the games, not on the hardware, a very different business model from PCs. As for consoles being "superior", keep in mind consoles pretty much ONLY run games; they do not offer nearly as much flexibility as our PCs do, and of course the downside of such flexibility is some loss of performance. That's why, arguably, XBOX games at first were better than PC games in terms of 3D quality, even though PC hardware was basically identical.

As for your dual core argument: like I said, whether it will make a difference depends on the workload. For rendering, media encoding, compilation and such, yes, there will be a benefit. For games... no, or at least not anytime soon. A dual (quad/octa) core CPU will not give you *any* advantage over a single core CPU for today's games.
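A toy sketch of the distinction (Python purely for illustration; the era's real renderers and encoders would be C/C++, and all function names here are made up): workloads like rendering split into independent chunks that extra cores can chew on, which is exactly what a game's serial main loop does not do.

```python
from concurrent.futures import ThreadPoolExecutor

def shade_row(y, width=64):
    # Toy per-row "shader": each row depends only on its own inputs,
    # so rows can be computed in any order, on any core.
    return [(x * y) % 256 for x in range(width)]

def render(height=64, workers=4):
    # Embarrassingly parallel: hand independent rows to a pool of
    # workers. This is the shape of rendering/encoding workloads that
    # do scale with core count. (Python threads won't actually speed
    # up CPU-bound work because of the GIL; the structure is the
    # point, not the timing.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade_row, range(height)))
```

The result is identical however many workers you use, which is precisely what makes this kind of work easy to spread over 2, 4 or 8 CPUs.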

Finally, there is an economic side to this argument you are missing. Outdated CPUs aren't any cheaper to produce than current high end CPUs. Basically, the only thing that matters is die size (assuming reasonable yields and bin splits). Your Pentium 733 really isn't cheaper to produce than a P4 560; the reason one costs 30x as much is just supply and demand. And of course Intel and AMD need to recover billions of dollars in R&D and the fixed costs of running a fab. In other words, if you started selling some weird PC with 8 outdated Pentium IIs in it, and it became an instant success, it would soon cost more than our current machines, while performing much slower on 99% of the code out there.

Oh, and don't forget it's non-trivial to design motherboards that can accommodate more than one CPU using high speed FSBs without running into timing issues. One at 800 MHz is hard, two is damn hard, and 4+ is so incredibly hard, and requires so many layers, that the boards alone cost over $1000 to produce.

Trust me, if there were a benefit to your approach, someone would have done it already. Fact is, for server workloads there is such a benefit, and there are plenty of dual/multi core chips available and in the making. More extreme implementations like Sun's Niagara or even Cell are also around the corner, but rest assured such solutions will s*uck balls on our current desktop/gaming software. If you want to be clever and invent something new, find a way to make a game engine really benefit from multithreading. The hardware part of the solution is easy, and already done; it's just not useful yet.
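To make the "software is the hard part" point concrete, here is a minimal fork/join job pattern (illustrative Python with hypothetical subsystem names, not any real engine) of the kind a game engine would have to adopt per frame before a second core buys it anything:

```python
import threading

def run_frame_jobs(jobs):
    # Fork/join per frame: run independent subsystems (physics, AI,
    # audio, ...) on their own threads, then join before drawing.
    # The hard part in a real engine isn't this scaffolding -- it's
    # making the jobs truly independent, since game state is usually
    # one tangled chain of dependencies.
    results = {}

    def worker(name, fn):
        results[name] = fn()

    threads = [threading.Thread(target=worker, args=item)
               for item in jobs.items()]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

If physics needs AI's output (or vice versa) the threads just end up waiting on each other, and you are back to single core performance, which is why the hardware alone solves nothing.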

= The views stated herein are my personal views, and not necessarily the views of my wife. =
March 2, 2005 4:38:21 AM

i agree slvr dldo thinks he`s clever but he`s not.
i`ve already realized that my idea of more than one cpu was not the best idea, but it does seem to be exactly what they are in fact doing with dual core. the fact that it might not make any difference with games does not seem to bother intel. i just don`t appreciate slvr dldo acting superior to me. he seems to think the idea is impossible, which it is not; the fact that it would cost a lot and not do anything with present software i am not going to argue.

Quote:

If you want to be clever and invent something new, find a way to make a game engine really benefit from multithreading. The hardware part of the solution is easy, and already done, just not usefull yet.

i think this is more something for slvr dldo.


hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
March 2, 2005 4:55:19 AM

Quote:

1) Generalizing all software developers based on Microsoft is like generalizing all police based on the Keystone Cops.
2) Have you even heard of Linux? How about Unix, BSD, BeOS, PalmOS, etc. ad nauseam? I suppose next you're going to tell me that they're all sloppy code as well?
3) Have you ever thought to consider how much more WinXP does? I'm not saying that WinXP isn't bloated, but geeze, you're comparing an ancient minimally-functional OS to what could quite possibly be the most feature-rich OS written yet. Features take resources.


what i`m saying is i think a lot of programmers are writing sloppy, bloated code. a while back toms had a link to a very small, very well written game, i believe around 95k,
and the thing was unbelievable for so little code. now why do games like the new axis and allies require so much cpu power? the graphics are not any better than, let`s say, age of empires (which i might add has much better AI).
axis and allies REQUIRES 1.5 ghz
and recommends 2 ghz.
if i`m not mistaken, age of empires required 133 mhz.
this is sloppy programming.
games are frequently shipped unfinished,
requiring numerous patches.
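For context, tiny games in that size class usually stay small by generating textures and geometry from formulas at load time instead of shipping art assets. A deliberately crude sketch of the idea (the formula below is made up for illustration, not taken from any real demo):

```python
import math

def procedural_texture(size=16, seed=7):
    # Compute a texture from a formula instead of storing pixels:
    # a few lines of code can replace kilobytes (or megabytes) of
    # shipped image data. The formula here is purely illustrative.
    tex = []
    for y in range(size):
        row = []
        for x in range(size):
            v = math.sin(x * seed * 0.3) * math.cos(y * seed * 0.2)
            row.append(int((v + 1) * 127.5))  # map [-1, 1] -> [0, 255]
        tex.append(row)
    return tex
```

The trade-off is that the content must be computed at startup, which is one reason such demos feel impressive per byte rather than per CPU cycle.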

lets get going slacker
do some free overtime



hear me now, believe me later, trust me in between
i`m a cop you idiot
i`ll be back
March 2, 2005 8:13:48 AM

Quote:
*puts a marshmallow on a stick and holds it next to arnold*

Can I have a few of those marshmallows? *pulls up chair next to confoundicator*



READ THE STICKY AND WIN A PRIZE! ALL PRIZES CAN BE CLAIMED IN THE SECTION TITLED "THE OTHER"
March 2, 2005 8:18:45 AM

Arnold, you are a little whacked, but I'm glad that you figured out how to do the quotes :tongue: There was a discussion going on "in that OTHER place" about you being on speed :lol:  I'm starting to think that you may be :tongue: Sleep, Arnold. Sleep is what you need. :eek: 

READ THE STICKY AND WIN A PRIZE! ALL PRIZES CAN BE CLAIMED IN THE SECTION TITLED "THE OTHER"