
32-Core Processors: Intel Reaches For (The) Sun

July 10, 2006 2:38:28 PM

What is the processor future like? Project Keifer could trigger a major core count increase, four threads per core, integrated memory controllers (yes!) and a ring-type interconnect for cores and L3 caches.

Speak out in the Tom's Hardware reader survey!
July 10, 2006 3:05:14 PM

Yes, we read it too. Thanks for reposting it though :roll:

Does it really seem that impressive? Ten years ago we were all dreaming of a processor that could pull off the unthinkable feat..... reach 1 GHz 8O
July 10, 2006 3:34:22 PM

It sounds nice, but I'm not holding my breath until 2010 for something Intel wishes to use against a current product.
July 10, 2006 3:42:20 PM

Performance gains of 16x in only 3 1/2 years is what's most impressive. I think that's what the article was meant to imply. We all know that technology tends to double each generation, but this is happening in half the time. Kewl. :D 
July 10, 2006 4:30:20 PM

Quote:
Yes we read it too. Thanks of reposting it though :roll:


It's not a re-post. This is the official discussion, linked to the article. There's a link at the bottom of the first page (I thought you read it? ;) )

Quote:
At the same time, we've heard rumors that the project might already [be] dead.

Yeah just a little edit ;) 

Quote:
The second bottleneck is the system's main memory. It is not a part of the processor, but resides in the chipset northbridge on the motherboard.

The memory doesn't reside on the northbridge, per se. The controller resides on the northbridge. The memory resides on JEDEC DIMMs. Just nitpicking, I know :) 

Quote:
The key for these wet dreams is a modular design approach that is based on eight processing nodes, each carrying a common 3 MB L2 cache (24 MB total) and four processor cores with 512 kB shared L2 cache.


From Wikipedia- "Wet Dream": A nocturnal emission is an ejaculation of semen experienced during sleep.

Kind of an off-color comment, I thought.
July 10, 2006 4:33:58 PM

They already have 32+ core computing, google: Blade Server :D  j/k

Ya, if these ever do come to fruition I can't imagine the power requirements for 32 cores. Even if they were only 25 watts each, that's still a chip eating up 800 watts of power. To be feasible, each core couldn't use more than 4 watts.
July 10, 2006 4:39:46 PM

When are they going to come up with the technology to tap the power of the human brain :?: That would be cool to turn your brain into the ultimate gaming machine :!: Of course all those smart people would have the best systems. :D 
July 10, 2006 4:41:05 PM

Quote:
ABIT IC7-MAX3 (Garbage)


Do you say that because it doesn't OC very well?

I gave up on my own MAX3. It has a 3 GHz P4 that stays at 3 GHz.
July 10, 2006 4:53:16 PM

Quote:
When are they going to come up with the technology to tap the power of the human brain :?: That would be cool to turn your brain into the ultimate gaming machine :!: Of course all those smart people would have the best systems. :D 


Most people's brains are the same; they're just utilized differently, unless you have some sort of neural disorder. (like me)
July 10, 2006 4:59:35 PM

Not Fair!! :lol: 
Absolutely no upgrade path. :cry: 
July 10, 2006 5:04:39 PM

Sure there is, we'll just have to develop cyber brains :D  a la Ghost in the Shell :D 
July 10, 2006 5:10:17 PM

Intel is going berserk! What are they doing?? 8O
Like lcdguy said, each core shouldn't use more than 3-4W to keep the TDP within a normal range.
Besides, each node (4 cores with 512 KB L2) is ~25 mm². Multiply that by 8 and add the 24 MB cache... Isn't that too big for a die??
July 10, 2006 5:14:24 PM

If the goal is to have the most cores, then great. However, my goal is to have the best performance. I don't see how software is going to use 32 threads at once. I think dual-core is a good fit now, I think quad-core is perhaps overkill, 16 and 32 is just insane.

Unless the way software is written radically changes, it just doesn't make sense. Why is Intel going from one bad extreme to another? There's got to be a happy medium!

You can put 1,024 cores on a chip, but what if you are only using 3 or 4 of them, pretty silly I think!

PS - Why is this in the memory category...again...?
July 10, 2006 5:22:42 PM

I think with the shift to multi-core processors becoming so popular, a lot of programmers will start developing new methods of programming. After all, this is four years out. There's a lot that can happen with both hardware and software. Who knows, software might become just as modular as the processors, and be able to have many segments of code split up across different cores. We'll see. If not, then I guess you'll at least have a computer that can game, rip CDs, encode DVDs, play music, make coffee, and 27 other things all at the same time. :) 
July 10, 2006 6:10:03 PM

Personally, I don't see 32-core chips being economically feasible for the home user, since dual/quad core will be more than enough. But a 32-core chip would greatly decrease the overhead for servers. For example, with a quad-socket 32-core server you would have 128 cores to run your server app in one server box. :D 
July 10, 2006 6:19:53 PM

Quote:
If the goal is to have the most cores, then great. However, my goal is to have the best performance. I don't see how software is going to use 32 threads at once.

You can put 1,024 cores on a chip, but what if you are only using 3 or 4 of them, pretty silly I think!


The article is about CPUs for a server environment. They can easily use 32 threads at once. :roll:
July 10, 2006 6:24:32 PM

Quote:
The key for these wet dreams is


I agree, very inappropriate and unprofessional sexual comment for an article about computers. This should be removed.
July 10, 2006 6:45:42 PM

Quote:
If the goal is to have the most cores, then great. However, my goal is to have the best performance. I don't see how software is going to use 32 threads at once.

You can put 1,024 cores on a chip, but what if you are only using 3 or 4 of them, pretty silly I think!


The article is about CPUs for a server environment. They can easily use 32 threads at once. :roll:

Aren't the extra cores required for transporter technology? What type of processors does the Enterprise use? Speaking of which, what about the warp drive? I'm sure it needs the extra cores. :lol: 
July 10, 2006 6:46:27 PM

Aye, this idea will only be usable by servers and/or real process-crunching programs (IE, graphics/movie rendering and effects, HD video rendering, etc.). That's all great for them, but for the home user, I don't see MUCH use. But again, this is what everyone said when the Wright brothers wanted to fly. So... we'll see what happens. If a 32-core chip comes out, is affordable, and shows performance gains, SIGN ME UP! If it doesn't, no big deal; I'll just stick with a quad core longer is all.
July 10, 2006 6:47:52 PM

I hope the price breakdown will be the same as Conroe's.
July 10, 2006 6:48:04 PM

Yup, but even then, not everyone can afford an aircraft; most can only afford to use one every once in a while.
July 10, 2006 6:54:47 PM

Quote:
If the goal is to have the most cores, then great. However, my goal is to have the best performance. I don't see how software is going to use 32 threads at once. I think dual-core is a good fit now, I think quad-core is perhaps overkill, 16 and 32 is just insane.

Unless the way software is written radically changes, it just doesn't make sense. Why is Intel going from one bad extreme to another? There's got to be a happy medium!

You can put 1,024 cores on a chip, but what if you are only using 3 or 4 of them, pretty silly I think!

PS - Why is this in the memory category...again...?


This is not silly at all and it makes a lot of sense.
All of these are for servers and server apps are highly multi-threaded.
They are not meant for you to just browse the Internet at home.

The current Niagara has 8-core 32-thread, while the upcoming Niagara 2 has 8-core 64-thread.
They perform extremely well when handling tens of requests at the same time.
Go to eBay.com and you can see it is powered by Sun.
I wonder what Google is using.

I agree with Heyyou27. Sun is already selling 8-core 32-thread machines.
Will this 32-core be cutting-edge in 2010?
July 10, 2006 7:32:32 PM

Well, imagine this for a rendering farm / calculation farm:

With a blade setup using 8-way server boards, you could compress an incredible amount of computing power into a very small space :D 
July 10, 2006 9:21:04 PM

Quote:
The key for these wet dreams is


I agree, very inappropriate and unprofessional sexual comment for an article about computers. This should be removed.

I think the term 'wet dream' has moved more into common vernacular over the years and has come to be accepted under the more colloquial meaning of 'something that is really good'. When reading the article, the term didn't stand out as out of place to me at all.
July 10, 2006 9:28:06 PM

Multi-core programming IS in the works. Lots of people knew this was coming for a long time.
Some info HERE
So yes, there will be software that can actually use more than 2 cores.
The trick is unblocking the other 31 CPUs for threads, then just adding threads without worrying about locking and syncing.
It just won't happen easily right now.
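A minimal sketch of that thread-pool idea (Python; `handle_request` is a made-up stand-in for a real task): hand the pool independent work items and let the runtime spread them over however many cores exist, instead of hand-managing locks per thread.

```python
from concurrent.futures import ThreadPoolExecutor

def handle_request(n):
    # Stand-in for one independent unit of work (e.g. one server request).
    return n * n

# One worker per hardware thread; on a 32-core part this could be 32 or more.
with ThreadPoolExecutor(max_workers=32) as pool:
    results = list(pool.map(handle_request, range(100)))
```

Because each `handle_request` touches no shared state, no locking or syncing is needed; that independence is exactly the "unblocking" meant above.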
July 10, 2006 9:55:00 PM

With this much processing power maybe I could be playing Quake 10 on my own trek style holo deck in 10 years time.
July 10, 2006 10:26:13 PM

Quote:
I wonder what Google is using.
I hear that they use Opterons pretty much exclusively; supposedly they put in an order for like 500 new Opteron servers a couple of months ago (read it in some article somewhere, just google it :lol:  ).
July 10, 2006 10:30:30 PM

TC,

You really need to expand your horizons... I currently work on some stuff that could utilize all four threads on all 32 procs. You give it to me, I will use it!!


Also, the thought of having to be near 3-4 W per core is not actually correct... Given a single chassis that would theoretically perform like they state, you could use a single 2U chassis to replace 16 current Opty chassis. If you take the requirements of dual PSUs and other per-chassis items into account, you could end up with quite a savings.

You might have to get better at thermal management, but even desktops are doing that nowadays; see the simple Antec P180.
July 10, 2006 10:36:08 PM

Wolfman, you think they need all that power to run IE? ;) 

I guess for version 7.0 it would not hurt ;) 

Quote:
Aye, this idea will only be usable by servers and/or real process crunching programs (IE, graphics/movie rendering and effects, HD video rendering, etc.)
July 10, 2006 10:53:14 PM

In 1997 we were looking into the IBM SP2 with 48 nodes.

Each node could hold either a memory expansion card or a CPU expansion card. The CPU expansion card would hold up to 16 CPUs and would address up to 64 GB of memory....

The 8-processor DEC Alpha 8400s we were using just could not cut it anymore!


These SMP examples have been around for some time (much earlier than 1997). The difference is that these are multiple CPU instances instead of cores. Applications written for multiple CPUs vs. multiple cores are nearly identical (with the exception of possible optimizations for the specific hardware).

Just a history lesson from a not-so-old fart ;) 
July 10, 2006 11:50:46 PM

That's f*cking crazy. Dual-core processors came out about 1.5 years ago, after single core CPUs had been around for 30+ years. Now, 1.5 years after dual-core CPUs, Intel is already talking about 32 cores. Most software isn't multithreaded yet, which just shows you how much faster hardware advances than software.
July 11, 2006 12:01:45 AM

Angry,

You are possibly a teeny weenie little bit wrong.

The software is already out, and most of it, I would dare say, would even use 32 cores right now if allowed.

I have worked on a project that was using a thread-pool architecture. We made the number of allowed threads within the pool configurable.

A simple config file change and right away we could make full use of this architecture.
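A hedged sketch of what such a config-driven pool could look like (Python; the `[pool]` section name and `size` key are invented for illustration):

```python
import configparser
from concurrent.futures import ThreadPoolExecutor

# Pretend this string was read from a config file on disk; bumping
# size from 2 to 32 requires no code change at all.
cfg = configparser.ConfigParser()
cfg.read_string("[pool]\nsize = 32\n")
pool_size = cfg.getint("pool", "size")

def work(item):
    return item + 1  # stand-in for a real task

with ThreadPoolExecutor(max_workers=pool_size) as pool:
    out = list(pool.map(work, range(8)))
```

The same binary then scales its thread count to whatever hardware it lands on, which is the "config file change" described above.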
July 11, 2006 2:03:10 AM

Quote:
Angry,

You are possibly a teeny weenie little bit wrong.

The software is already out, and most of it, I would dare say, would even use 32 cores right now if allowed.

I have worked on a project that was using a thread-pool architecture. We made the number of allowed threads within the pool configurable.

A simple config file change and right away we could make full use of this architecture.


I stand corrected. But single core CPUs aren't going to disappear overnight. Multi-threaded software will have to be backwards compatible so that it will run on single-core CPUs, otherwise the software companies are missing out on a large part of the market.
July 11, 2006 10:05:43 AM

32 cores is AMAZING - I don't think it will result in 32x (or 16x) gains for regular users, considering that regular desktop software is still designed for single cores, but this will be a blessing for people wanting to do scientific computing on the cheap, delivering performance currently available in mid-range servers to a desktop (of course it would have more RAM / HD space than a regular desktop). CAE and CFD analysts will definitely love this, and just think about POV-Ray on 32 cores!!!
July 11, 2006 3:04:15 PM

Quote:
The key for these wet dreams is


I agree, very inappropriate and unprofessional sexual comment for an article about computers. This should be removed.

I think the term 'wet dream' has moved more into common vernacular over the years and has come to be accepted under the more colloquial meaning of 'something that is really good'. When reading the article, the term didn't stand out as out of place to me at all.

The word n*gger has become common vernacular among certain ethnic groups in the United States, but that doesn't make it appropriate for a "professional" article.

I'm just saying it's inappropriate and demeans the article. I don't use expletives during meetings because it's inappropriate, and I don't make sexual references in a professional setting. I would consider this courteous to my audience.

Perhaps society is moving faster than I am, but I don't believe sexual references, regardless of the intent, belong in a professional setting. It just leaves a bad taste in my mouth.

Good article, but again, THG needs better editing.
July 11, 2006 3:06:56 PM

Quote:
Perhaps society is moving faster than I am, but I don't believe sexual references, regardless of the intent, belong in a professional setting. It just leaves a bad taste in my mouth.


interesting choice of words, considering the topic. :lol:  :wink:
July 11, 2006 3:37:51 PM

I think some of you are missing the point... 32 Cores is meant for servers, but I can guarantee that the home operating system will benefit from multiple cores, and if you read on I can prove it.

Just a technical overview of threads...

A hardware "thread" is, for the most part, a simple program executing commands sequentially. Windows is a multi-tasking (or multi-threaded) OS, so it takes that hardware thread and divides it into small chunks called quantums. The OS manages software threads, which is just a fancy way of distributing the processor time (or quantums) to different programs based on the program priority and state.

The problem is that while a program is running, it stores certain values in special memory locations on the processor. When the software thread changes, these memory locations (registers) need to be saved and then later restored when that software thread receives another quantum. This is called Context Switching, and this takes processor time (cycles).

When you have too many software threads running, there is a lot of context switching occurring. This means a lot of processor time is spent managing threads rather than executing software. This is referred to as thrashing the processor.

More cores means that the computer can run more software (or more instructions) without worrying about context switching. This has a very direct impact on performance, because less time is spent on context switching.

Games only use one thread because that's been a hardware limitation, but you'll very soon see games taking advantage of multiple hardware threads. With 32 cores, in Splinter Cell, each opponent might have a core dedicated to its AI, making it able to make decisions independent of the timing of rendering and other factors. Timing in game development is huge. Also consider those using Roger Wilco (VoIP) and Winamp while they game. That has a direct impact on gaming performance, unless you have enough cores to offload the work...

Multiple cores are a great way to get linear scaling, in theory. The problem is that unless you want to buy 1 GB of memory for every core in your computer, there needs to be a way for the cores to share memory (and other resources). This is where the REAL bottleneck is with multiple cores, but that's another explanation entirely :) 

If you need proof that a home computer can benefit from multiple cores, create a performance log and run a game.

Go to:
CLICK -> Start
RIGHT-CLICK -> My Computer
CLICK -> Manage
CLICK -> Performance Logs and Alerts
CLICK -> Counter Logs
CLICK -> Toolbar -> Action -> New Log Settings
CLICK -> Add Counters
PRESS -> ALT+O
PRESS -> T until you see "Threads"
SELECT -> Context Switches/Second

(You can also add threads if you want, to see the number of threads running).

That will give you an idea of how more hardware threads can benefit you. In 10 years, we'll all wonder how we ever got along with only a single core...
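The trade-off described above can be roughly illustrated in code (Python sketch; note that CPython serializes its threads on one interpreter lock, so on any machine the extra threads here add only switching overhead, never speed, which is exactly the point):

```python
import threading

def spin(count, out, idx):
    # Trivial CPU-bound work; with many threads the scheduler must keep
    # saving and restoring register/thread state (context switching).
    total = 0
    for _ in range(count):
        total += 1
    out[idx] = total

WORK = 200_000
for nthreads in (1, 32):
    chunk = WORK // nthreads
    results = [0] * nthreads
    threads = [threading.Thread(target=spin, args=(chunk, results, i))
               for i in range(nthreads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Same total work either way; only the switching overhead differs.
    assert sum(results) == WORK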
July 11, 2006 3:40:43 PM

Quote:
Perhaps society is moving faster than I am, but I don't believe sexual references, regardless of the intent, belong in a professional setting. It just leaves a bad taste in my mouth.


interesting choice of words, considering the topic. :lol:  :wink:

LOL! Freudian slip, I guess. lol
July 11, 2006 3:51:20 PM

Quote:
ABIT IC7-MAX3 (Garbage)


Do you say that because it doesn't OC very well?

I gave up on my own MAX3. It has a 3 GHz P4 that stays at 3 GHz.

I had to RMA it 3 times. Then I still had problems with the BIOS, AGP slot, Ethernet, and drivers until BIOS revision 14 or so. I didn't even TRY to OC it. :(  I got no help from the tech support at abit, other than "Send it in again, and we'll send you another one." Then I had the northbridge fan die on me 3 times (I kept the fans when I RMA'ed :) ). Just a crappy board altogether.
July 11, 2006 4:06:30 PM

Quote:
Perhaps society is moving faster than I am, but I don't believe sexual references, regardless of the intent, belong in a professional setting. It just leaves a bad taste in my mouth.


interesting choice of words, considering the topic. :lol:  :wink:

LOL! Freudian slip, I guess. lol

>.< Gah! I was hoping I'd be the first to point out that wet dreams leave a bad taste in his mouth... man... should've read the article last night.
July 11, 2006 4:30:40 PM

I was talking to a friend recently about multiple cores and threading, and I was a little confused because he kept saying that there were so few things that could be multithreaded. I'm wondering to what extent this is true, because I can't think of many things today which can't be subdivided into independent tasks. He was convinced that the only practical purpose was so my computer could scan for viruses and malware, index files, monitor incoming network traffic, and run Explorer while I did real tasks like gaming and such.

But as far as a single task being subdivided goes... for example, if you do a multiple pass video encode, couldn't you dedicate one core to each pass, and just stagger them a bit? If you're trying to apply a filter to a picture, isn't that like applying a filter to multiple pictures at the same time? Couldn't converting a 4 minute WAV file into an MP3 be like converting 4 1-minute WAV files, or 32 7.5-second WAV files into MP3s, at the same time? Couldn't Firefox or IE use one thread to load and monitor each tab you have open?

Or does software just need to be designed more creatively? Multiple threads = multiple AI opponents with dedicated threads. But if you have only a few opponents, couldn't you make each thread become like a potential action the AI would do, "compute" i.e. estimate the result, and then pick the best choice? If you have 4 spare threads, the AI can try to extrapolate what would happen if it were to 1) stay put, 2) look for a hiding spot, 3) confront an enemy head-on, or 4) locate and covertly follow an enemy; then it could figure out which is more likely to further its goals, and carry out the action.

So which is it? Is my friend correct, that some things simply can't be subdivided, or are software designers just not thinking creatively enough?
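The chunking idea in the question above can be sketched directly (Python; `encode_chunk` is a hypothetical stand-in for a real codec, and threads are used only to keep the sketch simple):

```python
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(samples):
    # Hypothetical per-chunk "encode": halve every sample value.
    return [s // 2 for s in samples]

def parallel_encode(samples, nchunks=4):
    # Split the input like cutting a 4-minute WAV into 4 one-minute pieces.
    # (Assumes len(samples) divides evenly by nchunks.)
    size = len(samples) // nchunks
    chunks = [samples[i * size:(i + 1) * size] for i in range(nchunks)]
    with ThreadPoolExecutor(max_workers=nchunks) as pool:
        encoded = pool.map(encode_chunk, chunks)
    out = []
    for part in encoded:
        out.extend(part)
    return out
```

This works because the chunks are independent; the "can't be subdivided" cases are the ones where chunk N's output depends on chunk N-1's.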
July 11, 2006 4:54:04 PM

I'm a programmer.
I don't have problems writing code for dual-socket CPUs.
I can't imagine what problems people are having. DOOM for DOS was good in its day, but modern software is written very differently now. Even UNIX was multi-processing 30 years ago.
In .NET 3.0, the CPU load balancing and unblocking will be more transparent.
Links available!
I also don't see why anyone would want single core (OK, maybe my Mom). Even if the software is primitive, with multi-core you can run VMware, and surf, and burn a DVD, and encode a video to DivX, etc., all at the same time WITHOUT LAGS. (This is an old topic.)
July 11, 2006 5:02:05 PM

Yes! ;) 
July 11, 2006 5:10:23 PM

Quote:
The key for these wet dreams is


I agree, very inappropriate and unprofessional sexual comment for an article about computers. This should be removed.

I think the term 'wet dream' has moved more into common vernacular over the years and has come to be accepted under the more colloquial meaning of 'something that is really good'. When reading the article, the term didn't stand out as out of place to me at all.

The word n*gger has become common vernacular among certain ethnic groups in the United States, but that doesn't make it appropriate for a "professional" article.

I'm just saying it's inappropriate and demeans the article. I don't use expletives during meetings because it's inappropriate, and I don't make sexual references in a professional setting. I would consider this courteous to my audience.

Perhaps society is moving faster than I am, but I don't believe sexual references, regardless of the intent, belong in a professional setting. It just leaves a bad taste in my mouth.

Good article, but again, THG needs better editing.

I see your point. Your previous example would definitely be inappropriate in this (and virtually any) context. Thinking in more of the business-meeting context, the term (wet dream) does seem at least a little too informal.

However, I have found that, in general, articles such as this (online and off) tend to be written more informally (and generally to a majority-male audience). If you have ever read any articles in a car magazine, you will find that Tom's is much more professionally written (albeit with significantly worse editing).

The only Tom's articles of late whose professionalism I find the need to question are the 'Who Designed This Crap!' series (which I admit, given the title, are supposed to be more of an editorial / shock value article).
July 11, 2006 6:36:43 PM

Quote:
I was talking to a friend recently about multiple cores and threading, and I was a little confused because he kept saying that there were so few things that could be multithreaded. I'm wondering to what extent this is true, because I can't think of many things today which can't be subdivided into independent tasks. He was convinced that the only practical purpose was so my computer could scan for viruses and malware, index files, monitor incoming network traffic, and run Explorer while I did real tasks like gaming and such.

But as far as a single task being subdivided goes... for example, if you do a multiple pass video encode, couldn't you dedicate one core to each pass, and just stagger them a bit? If you're trying to apply a filter to a picture, isn't that like applying a filter to multiple pictures at the same time? Couldn't converting a 4 minute WAV file into an MP3 be like converting 4 1-minute WAV files, or 32 7.5-second WAV files into MP3s, at the same time? Couldn't Firefox or IE use one thread to load and monitor each tab you have open?

Or does software just need to be designed more creatively? Multiple threads = multiple AI opponents with dedicated threads. But if you have only a few opponents, couldn't you make each thread become like a potential action the AI would do, "compute" i.e. estimate the result, and then pick the best choice? If you have 4 spare threads, the AI can try to extrapolate what would happen if it were to 1) stay put, 2) look for a hiding spot, 3) confront an enemy head-on, or 4) locate and covertly follow an enemy; then it could figure out which is more likely to further its goals, and carry out the action.

So which is it? Is my friend correct, that some things simply can't be subdivided, or are software designers just not thinking creatively enough?


Actually, that's incorrect. Most desktop software applications use multiple threads already: at least one for the User Interface and one for background operations. It's widely considered bad design to have your user interface performing application logic. Doing so causes your application to 'freeze' until the operation is complete. If your User Interface's thread is performing logic, then the user cannot interact with the application until it's done.

For example, if MS Word only used one thread, the application would show (Not Responding) when you print, until you were done printing. Instead, one thread monitors and responds to user requests (clicks and such) while a background thread completes the print job, thus preventing the application from 'freezing' while you print.

An even better example is the spell-checker in MS Word. While you type, there's a thread that constantly checks for spelling/grammar errors in the background.

More applicable to gaming, a lot of on-board network chipsets have a software TCP/IP stack, meaning when you're gaming online, your processor has to handle transposing some of the network traffic: something another core would be better suited to do.

High-level technologies, such as .NET and Java, allow for very easy multi-threading. Most desktop applications can be divided into multiple threads. It's the Operating System's responsibility to determine on what processor a thread runs. Longhorn (Vista) is supposed to have a better way to delegate threads to multiple cores. The application also has the ability to choose which processors run which threads, but that's not common.

If you open up your Task Manager, you have at least one thread for every process in that Process list. A process is a logical unit that manages context (including threads) specific to the application. Each process will have at least 1 thread associated with it; usually more.

In short, most applications already support multi-threading. SMP for gaming and encoding is a little more difficult, but not all that hard. It's just a different way of thinking/programming.
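The UI-thread / worker-thread split described above can be sketched in a few lines (Python; the "UI" here is just the main thread, and `print_job` is a made-up stand-in for the long operation):

```python
import threading
import queue

done = queue.Queue()

def print_job(pages):
    # Long-running background work (stand-in for spooling a print job).
    done.put(f"printed {pages} pages")

worker = threading.Thread(target=print_job, args=(30,))
worker.start()

# The main ("UI") thread is free here to keep responding to clicks
# while the worker runs; it only collects the result when it's ready.
result = done.get()
worker.join()
```

The queue is the hand-off point: the worker never touches the UI's state directly, so the UI never shows (Not Responding) while the job runs.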
July 11, 2006 8:33:40 PM

Curiosity question:
In what field are you actually programming?

Because your statements make me uneasy: writing this simple text here of maybe 300 bytes apparently already requires 39 threads and 321 MB of RAM on my rig (XP).
Well, I guess with Vista we're going to double even that.

So I might be tempted to call your Mom (single core) and ask her what went wrong in your childhood. 'cause back then all this was possible with a meg or two and a handful of MHz.

Seriously:
Even besides this server core, it's undeniably raining cores. But either way, besides some multimedia apps and gaming, the most common day-to-day applications themselves will hardly profit: any app that is I/O-heavy is per se limited by its serial part. So it's rather about running more stuff at once than running one thing faster, isn't it?

But somehow I have the sneaky feeling that THIS message is not the message that AMD / Intel are pushing. That starts already with the naming schemes, where "two cores" is associated with "twice as fast".
July 11, 2006 8:51:09 PM

I think you are really raising an OS question.
Yes, OSs keep getting more bloated. Like you said, it takes 400 megs and 32 threads just to write a 300 byte text message!
I remember the Amiga, which was multi-tasking (flaky) and could run easily with only 1 meg. That Amiga OS seemed fast with only a 20 meg hard drive and a 9 MHz 68000 CPU.
I wish I could see what that OS could do if it had modern PC hardware!!
Maybe play Text-Invaders at 10000 fps while rendering a 3D ray-traced movie in Turbo Silver.

I had hopes of BeOS or Linux being a modern super-lean OS. But that didn't seem to happen.
July 11, 2006 10:23:59 PM

> I remember the Amiga, which was multi-tasking (flaky) and could run easily with only 1 meg. That
> Amiga OS seemed fast with only a 20 meg hard drive and a 9 MHz 68000 CPU.

Mine didn't even have a hard drive. Just two floppy drives ...

> I had hopes of BeOS or Linux being a modern super-lean OS. But that didn't seem to happen.

Well, imho in a way Linux has: just for fun I enjoyed myself a while ago setting up an Apache webserver on
a Sharp Zaurus PDA with a lean kernel build. Worked surprisingly well ...

But back to the main topic:
From time to time I have to hack some scientific software on an eight-way Opteron rig.
Mostly OpenMP / MPI stuff, and although the data is multidimensional and therefore
lends itself very well to parallel processing, the gain from throwing more processors
at the problem flattens soberingly fast.
As soon as the amount of interdependencies and barrier points rises, things are quickly
kept at bay by the required comm overhead. And (again) I/O is a big problem.

This is already evident on an increasingly common two-core XP setup: run SiSoft Sandra and encode
a movie together - no problem.
Run the .NET compiler and a movie encode together on stuff sitting on the same hard drive: it will not matter
whether it's one or two or four cores, the hard drive will determine the limit.

So while there is quite some milage to gain from parallelism, miracles are not to be expected.

And in particular highly parallelized chips like this 32 core beast are specialists for workloads
where cartloads of independent threads get forked. A webserver dishing out servlets is certainly
such a case, but scenarios of this type are more the exception than the rule.
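The flattening described above is essentially Amdahl's law: if a fraction s of the job is serial (I/O, barriers, comm overhead), the best possible speedup on n cores is 1 / (s + (1 - s)/n). A quick sketch in Python:

```python
def amdahl(serial_fraction, cores):
    # Upper bound on speedup when serial_fraction of the work
    # cannot be parallelized.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a modest 10% serial part caps a 32-core chip well below 32x:
for n in (2, 8, 32):
    print(n, round(amdahl(0.10, n), 2))  # -> 1.82, 4.71, 7.8
```

Which is why a 32-core chip shines on cartloads of independent forked threads (no shared serial part) and disappoints on tightly coupled workloads.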
July 11, 2006 10:48:18 PM

Good comments, erase_me...

The filesystem is often the weakest link in any system, that or a mounted optical drive. These subsystems are often just so slow.

The best system I have ever worked on was an 8-CPU DEC Alpha 8400 box hooked up to an EMS storage tank, 1 TB (huge for its time)... with 3 UW SCSI-3 interfaces to the off-chassis storage, using UFS and striped for performance. The database was well indexed, and 8 individual queries (forked and very large) would return in no time at all. Man, that was sweet. All that from a box that at the time was running 300 MHz CPUs.

What made it sweet was the response times on the data tank. We went from serialized queries, 1 at a time, to parallel queries, 8 at a time. The improvement was tremendous, but so was the bandwidth to the data.