Intel Says Moore's Law Depends on Increasing Efficiency

Perlmutter indicated that hitting the raw milestones of computational horsepower and transistor count is not the problem engineers face this decade. "While Moore’s Law continues to provide more transistors, power budgets limit our ability to use them," Perlmutter said.

If design-productivity limitations were the key problem in the 1980s, power dissipation in the 1990s, and leakage power in the 2000s, then efficiency is the issue of this decade, according to the executive. Perlmutter noted that an exascale supercomputer built with today's technologies could consume up to 1 gigawatt (GW), which makes it impractical to actually build. A terahertz computer could consume as much as 3 kW today, but may be scaled down to just 20 W by the end of the decade, Perlmutter said.
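To put those figures in perspective, here is a rough back-of-the-envelope calculation based only on the numbers quoted above (an illustration, not Intel's own targets):

```python
# Rough arithmetic on the power figures quoted above (illustrative only).
exascale_ops_per_s = 1e18    # one exaFLOP per second
exascale_power_w = 1e9       # ~1 GW with today's technologies, per Perlmutter

joules_per_op = exascale_power_w / exascale_ops_per_s
print(f"Energy per operation today: {joules_per_op * 1e9:.1f} nJ")  # ~1.0 nJ

thz_power_today_w = 3000.0   # ~3 kW terahertz computer today
thz_power_target_w = 20.0    # ~20 W target by the end of the decade
print(f"Required efficiency gain: {thz_power_today_w / thz_power_target_w:.0f}x")  # ~150x
```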

Intel is betting on its 3D tri-gate transistors, 3D die stacking (such as the Hybrid Memory Cube design), near-threshold-voltage logic, and "future heterogeneous technologies" to deliver much more efficient computing architectures. The company is also looking into fine-grain power and energy management, along with more intelligent, self-aware software that manages not just the processor but the power consumption of the entire platform, from top to bottom.
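Intel has not detailed how that platform-wide management software would work, but the kind of power visibility it relies on can already be sketched in software today. The snippet below is a minimal illustration, assuming a Linux machine with the intel_rapl powercap driver loaded (domain paths vary by system, and reading the counter may require elevated privileges); it is not the Intel technology described above.

```python
# Minimal sketch: estimate CPU package power from the Intel RAPL energy counter
# exposed by the Linux powercap interface. Assumes /sys/class/powercap/intel-rapl:0
# exists (intel_rapl driver loaded). Illustrative only; the counter wraps
# periodically, which a robust tool would handle.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-0 domain

def read_energy_uj() -> int:
    with open(RAPL_ENERGY) as f:
        return int(f.read().strip())

def average_power_watts(interval_s: float = 1.0) -> float:
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    return (e1 - e0) / 1e6 / interval_s  # microjoules -> joules -> watts

if __name__ == "__main__":
    print(f"Package power: {average_power_watts():.1f} W")
```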

    Top Comments
  • CaedenV
    The next 10 years will be interesting to say the least. In the '80s to mid '90s we saw computers go from useless to useful (which IBM won), then the mid '90s and '00s we saw the GHz race trying to get the most clock without burning up a CPU (which AMD won), now we are seeing the die shrink race trying to minimize the electrical movement and current used (which Intel is winning). Within the next 10 years I think we will see the use of new materials and better layout designs (more SOC style design where the NB/SB/GPU/wifi are all part of the CPU, and the mobo will merely have a CPU and 'feature chip'/BIOS/UEFI), followed by stacked design and aggressive instruction set changes to remove the traditional uses of the NB and SB. After that I think we will see things move away from binary (on to trinary or even base 8 as a form of data compression), and really exotic designs like those light based chips we hear about from time to time. All we know for sure is that the future will be pretty damn cool and I will eagerly await my terahertz computer that runs on 20W :)
    14
  • fazers_on_stun
    Cirdecus said: As far as moving away from binary computing, it's going to be a while. It won't be an evolution of the computer, it will be a completely new invention. Since its creation, the computer has been based on electricity, which has only 2 states: on and off. A computer can only do two things: either pass current or not. All of our glorious technology and computing devices, in our cars, homes, businesses, refrigerators, etc. are all based off of electricity. As long as the hardware is based on electricity, we're going to be using binary systems/Base 2. If we ever were to move from Base 2 or binary, it wouldn't be a new computer, it would be an entirely new creation and the computer would cease to exist. We would move from electricity to another element that would have more than 2 states. Light can have multiple waves of color which could lead to some exciting new invention, but ultimately, finding something other than electricity would bring us into a completely new automated computing age.


    Trinary (3 active states) logic has been around for decades, just never extensively used for general-purpose computing because the design tools, fabs, etc. are optimized for binary. In fact, there was some research showing that the optimal 'n-state' logic (most computational efficiency per transistor) is base e (e ≈ 2.718, Euler's number). Since we cannot yet implement partial logic states, the closest integer to that number is 3, or trinary logic.
    14
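    For the curious, that radix-economy argument can be checked numerically with the standard base-times-digits cost model (an illustrative sketch):

```python
# Quick check of the radix-economy claim above (illustrative sketch).
# Cost model: E(b, N) ~ b * log_b(N) = (b / ln b) * ln(N); the base-dependent
# factor b / ln(b) is minimized at b = e (~2.718), and 3 is the closest integer.
import math

for b in range(2, 7):
    print(f"base {b}: relative cost {b / math.log(b):.3f}")
print(f"base e: relative cost {math.e / math.log(math.e):.3f}  (minimum)")
```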
  • Kamab
    I read more intelligent and self-aware and my mind went straight to skynet.
    13
  • Other Comments
  • rangas
    yeah but at what cost for the consumer...
    -11
  • JeTJL
    It does seem that way, especially with all these really efficient ARM processors, Intel with 3D tri-gate, as well as AMD using the clock mesh tech in the new Piledriver CPUs. Another revision of Moore's Law, perhaps?

    rangas said: yeah but at what cost for the consumer...

    At a lower cost to consumers: the die shrinks are helping a bit by yielding a higher volume of processors. It may cost more to validate each of these new chips, but on the power consumption side we're saving money too.
    8