
Intel Says Moore's Law Depends on Increasing Efficiency

Source: ISSCC | 46 comments

In a keynote at ISSCC, Intel executive vice president and chief product officer Dadi Perlmutter provided a long-term outlook on processor requirements as the industry moves further into terascale computing, eventually reaching exascale systems around 2020.

Perlmutter indicated that simply hitting milestones of computational horsepower and transistor count is not the problem engineers face this decade. "While Moore’s Law continues to provide more transistors, power budgets limit our ability to use them," Perlmutter said.

According to the executive, design productivity was the key limitation of the 1980s, power dissipation of the 1990s, and leakage power of the 2000s; efficiency is the issue of this decade. Perlmutter noted that an exascale supercomputer built with today's technologies could consume up to 1 gigawatt (GW), making it impractical to actually build. A terahertz computer could consume up to 3 kW today, but may be scaled down to just 20 watts by the end of the decade, Perlmutter said.
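
To put those numbers in perspective, here is the rough arithmetic behind the efficiency gap. This is a back-of-the-envelope sketch; the only figure not quoted above is the usual definition of exascale as 10^18 operations per second.

```python
# Rough arithmetic behind Perlmutter's efficiency argument.
exa_ops_per_s = 1e18      # "exascale" = on the order of 10^18 operations per second
power_today_w = 1e9       # ~1 GW with today's technologies, per the keynote

# Energy spent per operation at that power level.
joules_per_op = power_today_w / exa_ops_per_s
print(f"Energy per operation at 1 GW: {joules_per_op * 1e9:.1f} nJ")   # ~1 nJ/op

# The "terahertz computer" example: 3 kW today vs. a 20 W target.
improvement = 3000 / 20
print(f"Required efficiency gain: {improvement:.0f}x")                 # 150x
```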

Intel is betting on its 3D tri-gate transistors, 3D die stacking (such as the Hybrid Memory Cube memory design), near-threshold-voltage logic, and "future heterogeneous technologies" to deliver much more efficient computing architectures. The company is also looking into fine-grained power and energy management, as well as more intelligent, self-aware software that manages not just the processor but the power consumption of the entire platform, from top to bottom.
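
As a small illustration of the kind of fine-grained power visibility software already has, modern Intel platforms expose package energy counters through the Linux powercap/RAPL interface. The following is a minimal sketch, not the specific mechanisms Perlmutter described; it assumes a Linux machine with the intel_rapl driver loaded, the usual /sys/class/powercap layout, and sufficient permissions to read the counters.

```python
# Minimal sketch: estimate average package power from the Linux RAPL
# powercap counters. Assumes /sys/class/powercap/intel-rapl:0 exists
# (intel_rapl driver); path and domain layout vary across platforms.
import time

ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"  # cumulative microjoules

def read_energy_uj() -> int:
    with open(ENERGY_FILE) as f:
        return int(f.read().strip())

def average_power_w(interval_s: float = 1.0) -> float:
    start = read_energy_uj()
    time.sleep(interval_s)
    end = read_energy_uj()
    # The counter wraps at max_energy_range_uj; wrap handling is omitted here.
    return (end - start) / 1e6 / interval_s

if __name__ == "__main__":
    print(f"Average package power: {average_power_w():.1f} W")
```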

Comments
  • 8 Hide
    JeTJL , February 24, 2012 2:22 PM
    It does seem to be heading that way, especially with all these really efficient ARM processors, Intel with 3D tri-gate, and AMD using clock mesh tech in the new Piledriver CPUs. Another revision of Moore's Law, perhaps?

    rangas: yeah but at what cost for the consumer...

    At a lower cost to consumers, actually. The die shrinks help by enabling a higher volume of processors; it may cost more to validate each of these new chips, but on the power consumption side we're also saving money.
  • 13 Hide
    Kamab , February 24, 2012 2:24 PM
    I read more intelligent and self-aware and my mind went straight to skynet.
  • -7 Hide
    mayankleoboy1 , February 24, 2012 2:26 PM
    yeah but when?
  • 10 Hide
    slicedtoad , February 24, 2012 2:53 PM
    Two years ago I'd have thought: "Idiots, who needs efficiency? Just use a bigger PSU and cooler."

    But... Sandy Bridge showed what excellent efficiency results in.
  • 14 Hide
    CaedenV , February 24, 2012 3:01 PM
    The next 10 years will be interesting to say the least. In the '80s to mid '90s we saw computers go from useless to useful (which IBM won), then the mid '90s and '00s we saw the GHz race trying to get the most clock without burning up a CPU (which AMD won), now we are seeing the die shrink race trying to minimize the electrical movement and current used (which Intel is winning). Within 10 years I think we will next see the use of new materials and better layout designs (a more SoC-style design where the NB/SB/GPU/wifi are all part of the CPU, and the mobo will merely have a CPU and 'feature chip'/BIOS/UEFI), followed by stacked designs and aggressive instruction set changes to remove the traditional uses of the NB and SB. After that I think we will see things move away from binary (on to trinary or even base 8 as a form of data compression), and really exotic designs like those light-based chips we hear about from time to time. All we know for sure is that the future will be pretty damn cool and I will eagerly await my terahertz computer that runs on 20W :)
  • 10 Hide
    cirdecus , February 24, 2012 3:14 PM
    As far as moving away from binary computing, it's going to be a while. It won't be an evolution of the computer, it will be a completely new invention. Since its creation, the computer has been based on electricity, which has only 2 states: On and Off. A computer can only do two things: Either pass current or not. All of our glorious technology and computing devices, in our cars, homes, businesses, refrigerators, etc., are all based on electricity. As long as the hardware is based on electricity, we're going to be using binary systems/Base 2.

    If we ever were to move from Base 2 or binary, it wouldn't be a new computer, it would be an entirely new creation and the computer would cease to exist. We would move from electricity to another element that would have more than 2 states. Light can have multiple waves of color, which could lead to some exciting new invention, but ultimately, finding something other than electricity would bring us into a completely new automated computing age.
  • 0 Hide
    eklerus , February 24, 2012 3:16 PM
    /start program
    /execute logical idea
    /modify your core program to reach an answer
    /if everything fails, be afraid of GOD

    no more skynet
  • 2 Hide
    yumri4 , February 24, 2012 3:20 PM
    On the article: I will be glad if Intel, AMD, VIA, IBM, MIT, and/or any other CPU maker can make a chip with integrated encryption instruction sets for AES and MD5, plus Wi-Fi and Bluetooth, while keeping a GPU on die, as that would be the best CPU design for the future, hitting most if not all markets. Even though Intel and AMD are the big two in CPU design, they get others to do the fabbing for them.
    @caedenv: Yeah, but even if you had a terahertz computer right now, the average person doesn't need anything above 2.5 GHz, so why make a 1 terahertz CPU when the vast majority of the market will see little to no difference but the cost? Same with the number of cores: right now AMD is doing better in consumer CPUs, while Intel is doing better in server CPUs with their 20-core CPU, but most things only use 2 cores, some games might use 4, and the higher core count only really matters once you get into A/V editing and CAD, which most people don't do.

    @Cirdecus: I really do not see us moving away from binary any time soon; even if we get a light-signal-based computer it will most likely still be a binary computer, just a lot faster. But if we do get away from base 2, what would we hypothetically go to instead?
  • 14 Hide
    fazers_on_stun , February 24, 2012 4:24 PM
    Cirdecus: As far as moving away from binary computing, its going to be a while. It won't be an evolution of the computer, it will be a completely new invention. Since its creation, the computer has been based on electricity, which has only 2 states: On and Off. A computer can only do two things: Either pass current or not. All of our glorious technology and computing devices, in our cars, homes, businesses, refrigerators, etc are all based off of electricity. As long as the hardware is based on electricity, we're going to be using binary systems/Base 2. If we ever were to move from Base 2 or Binary, it wouldn't be a new computer, it would be an entirely new creation and the computer would cease to exist. We would move from electricity to another element that would have more than 2 states. Light can have multiple waves of color which could lead to some exciting new invention, but ultimately, finding something other than electricity would bring us into a completely new automated computing age.


    Trinary (3 active states) logic has been around for decades, just never extensively used for general-purpose computing because the design tools, fabs, etc. are optimized for binary. In fact, there was some research showing that the optimal 'n-state' logic (most computational efficiency per transistor) is base e (about 2.718, Euler's number). Since we cannot yet implement partial logic states, the closest integer is 3, i.e. trinary logic.
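
The radix-economy argument in the comment above can be checked with a few lines of arithmetic. A rough sketch follows, using the standard cost model that charges each digit position a cost equal to the base; the choice of N = 10^6 is arbitrary and just for illustration.

```python
# Radix economy: cost of representing N in base b, charging b "states"
# per digit position. The continuous optimum of b / ln(b) is at b = e.
import math

N = 10**6
for b in (2, 3, 4, 8, 10):
    digits = math.ceil(math.log(N, b))
    print(f"base {b:>2}: {digits:>2} digits, cost = {b * digits}")

# base  2: 20 digits, cost = 40
# base  3: 13 digits, cost = 39   <- the integer base closest to e wins
# base  4: 10 digits, cost = 40
# base  8:  7 digits, cost = 56
# base 10:  6 digits, cost = 60
```
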
  • 3 Hide
    jasonw223 , February 24, 2012 4:52 PM
    I'd be OK with buying 3kW worth of power supplies to run a terahertz computer now...
  • 3 Hide
    lamorpa , February 24, 2012 4:56 PM
    mayankleoboy1: yeah but when?

    You may find some answers to this question in an article titled "Intel Says Moore's Law Depends on Increasing Efficiency", which you can find 8 inches above where you posted your comment.
  • 7 Hide
    lamorpa , February 24, 2012 5:02 PM
    Cirdecus: Since its creation, the computer has been based on electricity, which has only 2 states: On and Off. A computer can only do two things: Either pass current or not.

    Hilarious. The most fundamentally idiotic comment of the week! I'm still not sure if this is a joke comment or not.
  • -7 Hide
    lamorpa , February 24, 2012 5:21 PM
    Since creation, the puter has be base on that lectric, witch has only 2 straits: wax on n wax off. A puter only do two things: Either Ralph Macchio or Mr. Miyagi. All of our inglorious techo and putings, in r trucks, sheds, frigerators, shotguns, etc all be base of that lectric. As long as the puters is based on that lectric, we're going to be using flynary system/2 chopstick.
  • 0 Hide
    Pherule , February 24, 2012 6:32 PM
    Can you imagine transfer rates on a Trinary or 8 base system? I thought up the idea of an 8 base a long time ago myself, and nobody seemed interested. I guess some people will always be trapped in history, never wanting to make progress.

    You could probably download a 1 "GB" file in 1 second on dial up, using 8 base. Delicious.
  • 5 Hide
    rosen380 , February 24, 2012 6:48 PM
    "You could probably download a 1 "GB" file in 1 second on dial up, using 8 base. Delicious."

    Three base-2 digits (bits) can represent eight states, just like one base-8 digit. Assuming the computer can process a base-8 digit as fast as a bit, we're talking about a 3x improvement in performance, not the ~20000x improvement you are suggesting with your example.

    Since I'm pretty sure the CPU would need more time to process one base-8 digit than one bit, I doubt even that 3x increase would ever be seen...
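
The 3x figure in that reply follows directly from the information content of a base-8 symbol: log2(8) = 3 bits per digit. A quick check, where the 1 GB = 8 × 10^9 bits figure is just for illustration:

```python
import math

bits_per_octal_digit = math.log2(8)      # 3.0 bits of information per base-8 symbol
file_bits = 8 * 10**9                    # a 1 GB file, roughly 8e9 bits
symbols_binary = file_bits               # one bit per binary symbol
symbols_octal = file_bits / bits_per_octal_digit
print(symbols_binary / symbols_octal)    # 3.0 -> only a 3x density gain
```
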
  • 1 Hide
    lamorpa , February 24, 2012 6:57 PM
    Pherule: Can you imagine transfer rates on a Trinary or 8 base system? I thought up the idea of an 8 base a long time ago myself, and nobody seemed interested. I guess some people will always be trapped in history, never wanting to make progress. You could probably download a 1 "GB" file in 1 second on dial up, using 8 base. Delicious.

    You actually have no idea how deep your fundamental misunderstanding of this topic is. "thought up the idea of an 8 base"! "dial up, using 8 base" Classic! This might be better than the "electricity has only 2 states" guy.
  • 4 Hide
    PreferLinux , February 24, 2012 7:56 PM
    Cirdecus: As far as moving away from binary computing, its going to be a while. It won't be an evolution of the computer, it will be a completely new invention. Since its creation, the computer has been based on electricity, which has only 2 states: On and Off. A computer can only do two things: Either pass current or not. All of our glorious technology and computing devices, in our cars, homes, businesses, refrigerators, etc are all based off of electricity. As long as the hardware is based on electricity, we're going to be using binary systems/Base 2. If we ever were to move from Base 2 or Binary, it wouldn't be a new computer, it would be an entirely new creation and the computer would cease to exist. We would move from electricity to another element that would have more than 2 states. Light can have multiple waves of color which could lead to some exciting new invention, but ultimately, finding something other than electricity would bring us into a completely new automated computing age.

    Have you ever used a stereo? Or what about a computer monitor? Or a microwave or electric stove? What about hot water from an electric cylinder? All of them use electricity, and far more than two states – most of them use it with continuous variation, and the computer monitor will have at least 64 levels/states for each sub-pixel.
  • 1 Hide
    eddieroolz , February 24, 2012 8:56 PM
    This theme has been quite obvious for years now, especially in Tom's articles. Whereas people were hell-bent on achieving the highest possible clocks in the '90s and early 2000s, we are now focused on improving efficiency.
  • 2 Hide
    darklightdarklight , February 24, 2012 10:22 PM
    Kamab: I read more intelligent and self-aware and my mind went straight to skynet.


    Yea, I stopped reading at "self-aware", went down into my basement bunker, checked the expiration dates of the canned beans, and re-oiled the RPG launcher. When I got back to my computer, I turned up User Account Controls to MAX. Windows ain't gonna do shit without my authorization!