Intel Says Moore's Law Depends on Increasing Efficiency

Perlmutter indicated that hitting the raw milestones of computational horsepower and transistor count is not the problem engineers face this decade. "While Moore’s Law continues to provide more transistors, power budgets limit our ability to use them," Perlmutter said.

If design-productivity limitations were the key problem of the 1980s, power dissipation of the 1990s, and leakage power of the 2000s, then efficiency is the issue of this decade, according to the executive. Perlmutter noted that an exascale supercomputer built with today's technologies could consume up to 1 gigawatt (GW), making it impractical to actually build. A terahertz computer would consume up to 3 kW today, but may be scaled down to just 20 W by the end of the decade, Perlmutter said.
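
As a back-of-the-envelope check, those figures pin down the energy budget per operation. A short Python sketch, using only the wattage and throughput numbers quoted above:

    # Derived from the figures above: ~1 GW for an exascale (10^18 ops/s)
    # machine today, and a 3 kW terahertz computer targeted to reach 20 W.
    EXASCALE_OPS = 1e18     # operations per second
    power_today_w = 1e9     # ~1 GW on today's technologies

    joules_per_op = power_today_w / EXASCALE_OPS
    print(f"energy per operation today: {joules_per_op * 1e9:.1f} nJ")  # 1.0 nJ

    thz_now_w, thz_target_w = 3e3, 20.0
    print(f"required efficiency gain: {thz_now_w / thz_target_w:.0f}x")  # 150x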

Intel is betting on its 3D tri-gate transistors, 3D die stacking (such as the Hybrid Memory Cube design), near-threshold-voltage logic, and "future heterogeneous technologies" to deliver much more efficient computing architectures. The company is also looking into fine-grain power and energy management, and into more intelligent, self-aware software that manages not just the processor but the power consumption of the entire platform, from top to bottom.
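
What such platform-wide, fine-grain power management could look like in software is sketched below. This is a minimal illustration, not any real Intel API: the component names, budgets, and telemetry hooks are all assumed, and the power readings are simulated.

    import random
    import time

    # Illustrative per-component power budgets in watts (assumed values).
    BUDGETS = {"cpu": 15.0, "gpu": 10.0, "memory": 5.0, "radio": 2.0}
    caps = dict(BUDGETS)  # current power caps, starting at the full budget

    def read_power(component):
        # Stand-in for real telemetry (e.g. an energy counter); simulated
        # here as a draw of up to 20% above the component's current cap.
        return random.uniform(0.0, caps[component] * 1.2)

    def governor_tick():
        # One pass of the governor: throttle any component over budget,
        # and slowly relax caps that were previously tightened.
        for component, budget in BUDGETS.items():
            if read_power(component) > budget:
                caps[component] = max(budget * 0.5, caps[component] * 0.95)
            else:
                caps[component] = min(budget, caps[component] * 1.05)

    if __name__ == "__main__":
        for _ in range(50):       # run a few governor passes
            governor_tick()
            time.sleep(0.1)       # fine-grain: re-evaluate every 100 ms
        print({name: round(cap, 2) for name, cap in caps.items()})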

  • rangas
    yeah but at what cost for the consumer...
  • JeTJL
    It does seem that way, especially with all these really efficient ARM processors, Intel with 3D tri-gate, and AMD using the clock-mesh tech in the new Piledriver CPUs. Another revision of Moore's Law, perhaps?

    rangas: "yeah but at what cost for the consumer..."
    At a lower cost to consumers, actually: the die shrinks help by yielding a higher volume of processors. It may cost more to check each of these new chips out, but on the power-consumption side we're saving money too.
  • Kamab
    I read "more intelligent and self-aware" and my mind went straight to Skynet.
  • mayankleoboy1
    yeah but when?
  • slicedtoad
    Two years ago I'd have thought: "idiots, who needs efficiency? Just use a bigger PSU and cooler."

    But... Sandy Bridge showed what excellent efficiency results in.
  • CaedenV
    The next 10 years will be interesting, to say the least. From the '80s to the mid '90s we saw computers go from useless to useful (a race IBM won); then in the mid '90s and '00s we saw the GHz race, trying to get the most clock without burning up a CPU (which AMD won); now we are seeing the die-shrink race, trying to minimize the electrical movement and current used (which Intel is winning). Within 10 years I think we will next see the use of new materials and better layout designs (more SoC-style design where the NB/SB/GPU/WiFi are all part of the CPU, and the mobo will merely have a CPU and a 'feature chip'/BIOS/UEFI), followed by stacked designs and aggressive instruction-set changes to remove the traditional uses of the NB and SB. After that I think we will see things move away from binary (to trinary or even base 8 as a form of data compression), and really exotic designs like those light-based chips we hear about from time to time. All we know for sure is that the future will be pretty damn cool, and I will eagerly await my terahertz computer that runs on 20W :)
  • cirdecus
    As far as moving away from binary computing goes, it's going to be a while. It won't be an evolution of the computer; it will be a completely new invention. Since its creation, the computer has been based on electricity, which has only 2 states: on and off. A computer can only do two things: either pass current or not. All of our glorious technology and computing devices, in our cars, homes, businesses, refrigerators, etc., are based on electricity. As long as the hardware is based on electricity, we're going to be using binary systems/Base 2.

    If we ever were to move away from Base 2/binary, it wouldn't be a new computer; it would be an entirely new creation, and the computer as we know it would cease to exist. We would move from electricity to another medium that has more than 2 states. Light can have multiple wavelengths of color, which could lead to some exciting new invention, but ultimately, finding something other than electricity would bring us into a completely new automated computing age.
  • eklerus
    /start program
    /execute logical idea
    /modify your core program to reach an answer
    /if everything fails be afraid of GOD

    no more skynet
  • yumri4
    On the article: I will be glad if Intel, AMD, VIA, IBM, MIT, and/or any other CPU maker can make a chip with integrated encryption instruction sets for AES and MD5, plus Wi-Fi and Bluetooth, while keeping a GPU on die, as that would be the best CPU design for the future, hitting most if not all markets. Even though Intel and AMD are the big 2 in CPU designs, they get others to do the fab work for them.
    @caedenv Yeah, but even if you had a terahertz computer right now, an average person doesn't need anything above 2.5GHz, so why make a 1 terahertz CPU when the vast majority of the market will see little to no difference but the cost? Same with the number of cores: right now in consumer CPUs AMD is doing better, but in server CPUs Intel is doing better with their 20-core CPU. Most things only use 2 cores, some games might use 4, and the higher core count only matters when you get to AV editing and CAD, which most people do not do.

    @Cirdecus I really do not see us moving away from binary any time soon; even if we get a light-signal-based computer it will most likely still be a binary computer, just a lot faster. But if we do get away from base 2, what would we hypothetically go to instead?
  • fazers_on_stun
    Cirdecus: "As long as the hardware is based on electricity, we're going to be using binary systems/Base 2."
    Trinary (3 active states) logic has been around for decades; it has just never been used extensively for general-purpose computing because the design tools, fabs, etc. are optimized for binary. In fact, there was some research showing that the optimal 'n-state' logic (most computational efficiency per transistor) is base e (~2.718, Euler's number). Since we cannot yet implement partial logic states, the closest integer to that number is 3, i.e. trinary logic.
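
The radix-economy argument fazers_on_stun refers to is easy to check numerically: under the standard textbook cost model, representing N values in base r costs roughly r × log_r(N) (states per digit times digits needed), which is minimized at r = e. A short Python sketch:

    import math

    def radix_cost(r, n=1e6):
        # Cost of representing n distinct values in base r:
        # (states per digit) * (digits needed) = r * log_r(n)
        return r * math.log(n) / math.log(r)

    for r in (2, 3, 4, math.e):
        print(f"base {r:.3f}: relative cost {radix_cost(r):.2f}")

    # Base 3 comes out ~5% cheaper than base 2 (or base 4), and base e
    # is the non-integer optimum, matching the comment above.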