Scientists Propose Climate Supercomputer With 20 Million CPUs

Berkeley (CA) - In supercomputing, the sky is the limit, literally. In an effort to enable more credible global climate change predictions, researchers from UC Berkeley believe the way to go is a new kind of supercomputer with 20 million processors delivering a peak performance of 200 PFlops to simulate cloud-resolving, 1-km scale climate models. At the same time, the proposed system would not require a power plant all for itself. How is that possible, you ask? These guys are looking into ultra-efficient embedded RISC CPUs.

We are just about ready to transition from the Teraflop into the Petaflop era, and today we heard about a new proposal from UC Berkeley and Tensilica that, at least on paper, could put supercomputer development into warp speed. In a dramatic departure from current supercomputer architectures and upcoming hybrid systems, the proposed system would rely on embedded processors with minimal power consumption.

The researchers believe that 20 million Tensilica RISC processors would deliver at least 10 PFlops of sustained performance, while topping out at about 200 PFlops peak. The power consumption of such a system is estimated at about 4 megawatts, with construction and typical operating costs at about $75 million. A 200 PFlops system built on today's common architectures could cost up to $1 billion and consume 200 megawatts, roughly the equivalent of what a city of 100,000 people consumes.
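
If you want to check the math on those claims, a quick back-of-envelope calculation is all it takes. The Python sketch below uses only the figures quoted above; the per-core and efficiency numbers it derives are our own arithmetic, not part of the Berkeley proposal.

    # Back-of-envelope check of the quoted system figures (Python 3).
    # Inputs are the numbers from the article; the derived values are illustrative only.

    num_cores = 20_000_000        # proposed number of Tensilica processors
    sustained_pflops = 10         # claimed sustained performance, PFlops
    peak_pflops = 200             # claimed peak performance, PFlops
    power_mw_proposed = 4         # estimated power draw, megawatts
    power_mw_conventional = 200   # power draw of a conventional 200 PFlops system, megawatts

    # Sustained performance each embedded core would have to contribute
    per_core_mflops = sustained_pflops * 1e15 / num_cores / 1e6
    print(f"Sustained per core: {per_core_mflops:.0f} MFlops")              # ~500 MFlops

    # Energy efficiency of both designs at peak performance (MFlops per watt)
    eff_proposed = peak_pflops * 1e15 / (power_mw_proposed * 1e6) / 1e6
    eff_conventional = peak_pflops * 1e15 / (power_mw_conventional * 1e6) / 1e6
    print(f"Proposed design:     {eff_proposed:,.0f} MFlops per watt")      # ~50,000
    print(f"Conventional design: {eff_conventional:,.0f} MFlops per watt")  # ~1,000

In other words, each embedded core only needs to sustain on the order of 500 MFlops, and the design as a whole would be roughly 50 times more power-efficient than a conventional 200 PFlops machine, if the estimates hold.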

In comparison, the currently fastest supercomputer tops out at 576 TFlops.

Tensilica publishes little performance data for its Xtensa LX extensible processors that would allow a direct comparison with typical server processors. What we do know is that Tensilica builds the chips on 90 nm and 130 nm processes and clocks them between 150 and 450 MHz. Power consumption is quoted at "less than 0.1 mW per MHz", which works out to about 45 mW per processor in a worst-case scenario, according to the manufacturer.
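
That worst-case number is easy to verify, and it is worth scaling up to the proposed core count. In the sketch below, the per-core figure follows directly from the manufacturer's rating; attributing the rest of the roughly 4 megawatt system budget to memory, interconnect and cooling is our assumption, not Tensilica's.

    # Worst-case power per Tensilica core, from the manufacturer's quoted
    # "less than 0.1 mW per MHz", taken at the top of the 150-450 MHz clock range.
    power_per_mhz_mw = 0.1    # mW per MHz (manufacturer's upper bound)
    max_clock_mhz = 450       # highest quoted clock speed

    core_power_mw = power_per_mhz_mw * max_clock_mhz
    print(f"Worst-case power per core: {core_power_mw:.0f} mW")      # 45 mW

    # Scaled to the proposed 20 million cores (processors only; memory,
    # interconnect and cooling would have to account for the rest of the budget)
    num_cores = 20_000_000
    total_core_power_kw = core_power_mw * num_cores / 1_000 / 1_000  # mW -> W -> kW
    print(f"All 20 million cores: ~{total_core_power_kw:,.0f} kW")   # ~900 kW

Even in the worst case, then, the processors themselves would draw well under a megawatt, leaving plenty of headroom within the 4 megawatt system estimate.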

So, what would a 200 PFlops system be able to accomplish?

According to the researchers, such a computer would make global climate change predictions more understandable and more credible. Climate models today are built largely from historical data on rainfall, hurricanes, sea surface temperatures and atmospheric carbon dioxide. Accurate cloud simulations are much more complex, however, and well beyond the reach of current supercomputers. Past cloud models, the researchers claim, lack the detail that could improve the accuracy of climate predictions: that accuracy can only come from a system able to cope with 1-km scale models, which offer rich detail not available in existing models.

To develop such a 1-km cloud model, the researchers say they will need a supercomputer 1000 times more powerful than what is available today. The proposed 200 PFlops Tensilica system could put them into that range, at least in theory.

However, the UC Berkeley researchers claim that this "climate computer" is not just a concept: Michael Wehner, Lenny Oliker and John Shalf said they have been working with scientists from Colorado State University to build a prototype system to run a new global atmospheric model developed there. "What we have demonstrated is that in the exascale computing regime, it makes more sense to target machine design for specific applications," Wehner said. "It will be impractical from a cost and power perspective to build general-purpose machines like today's supercomputers."

  • Mary
    HP Labs has done some research on a supercomputer that could model the entire climate of the world. It would be the size of Paris, would boil the Seine for cooling and would become the single most significant factor changing the climate...
  • Mr_Man
    I was thinking something similar, Mary. Even these efficient processors in question would consume so much power and produce so much heat, it really just defeats the purpose.
    I will say that global warming theories could use some credibility, seeing as the data they use is mostly old. I just don't think one massive computer is the way to do it. Why not a SETI@home type of thing? Many people leave their computers on all day, anyway. Why not put their processors to use?
  • what about teraflops.... get the steps right... it's not giga to peta... its giga to tera to peta... anyway, I suck at english but seriously they need to learn to edit
  • JAYDEEJOHN
    Im thinking these scientists have too much money, I mean, whos throwing all this dough at em? Networking with what we already have isnt good enough? Or is it, someone just wants to sit atop all that power. Bigger is better, but cant we use what we have to check the motion of the ocean?
  • DXRick
    So, all they need is the hardware to predict the weather? They have already designed the code that would be executing on those 20m processors and just need someone to make the hardware?

    My BS alarm is going off.
  • werepossum
    Well, 2007 was unusually cool, a 0.7C degree drop from 2006, and 2008 looks to be more of the same. I saw a report last week that claimed we are in for a decade of cooling, but global warming is still valid. So, how many teraflops do you need to predict stuff that has already happened, anyway?

    Not to mention that any computer rated in petaflops is clearly going to tell us global warming is caused by evil meat-eating people.