Meet Moorestown: Intel's Atom Platform For The Next 10 Billion Devices

Processor Power

Intel likes to refer to Lincroft as having an “ultra-low-power,” or ULP, processing core. I’m not going to veer off into the basics of Atom microarchitecture or the differences in how it processes data compared to higher-end Intel chips. We’ve covered that business before. For now, I’ll simply note that Lincroft is a single-core part that uses Hyper-Threading to create two logical processors for the operating system. The chip supports 64-bit code and uses the same Intel Virtualization Technology (VT) as the Core 2 Duo. Lincroft features a 24K data cache and a 32K instruction cache at the L1 level. There’s 512K of L2 cache and no L3. As we discuss power and performance from here on, know that we’re talking about a 1.9 GHz Lincroft part. To the best of my knowledge, this will be the top of the Z6xx stack in the near-term.

Now, it’s important to add that the 1.9 GHz clock is a “burst mode” rate, which deserves some explanation. Lincroft has four primary power states: Ultra-Low-Frequency Mode (ULFM), Low-Frequency Mode (LFM), High-Frequency Mode (HFM), and Burst Mode. Even the Lincroft model specced at 1.9 GHz is likely to spend a lot of its operational time in a 200 MHz ULFM.
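For readers who think better in code, the four states form a simple ladder. Here’s a minimal sketch; note that only the 200 MHz ULFM and 1.9 GHz Burst clocks come from Intel’s disclosures, while the LFM and HFM values below are placeholder assumptions:

```python
from enum import Enum

class PState(Enum):
    """Lincroft's four primary power states. Only the ULFM (200 MHz)
    and Burst (1.9 GHz) clocks are stated by Intel; the LFM and HFM
    values below are illustrative placeholders, not real specs."""
    ULFM = 200      # Ultra-Low-Frequency Mode, MHz
    LFM = 600      # Low-Frequency Mode (assumed value)
    HFM = 1100     # High-Frequency Mode, the sustained thermal limit (assumed value)
    BURST = 1900   # Burst Mode, on-demand peak

# The chip idles at the bottom of the ladder and climbs only on demand.
print([state.name for state in PState])
```

The point of the ladder is that the “headline” clock sits at the very top, visited only in short spurts, while the bottom rung is home.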

Intel Burst Performance Technology (BPT) is a bit like the Turbo Boost we’ve seen implemented on the desktop in that it provides on-demand performance when needed, and when power and thermal profiles will accommodate it. In the graph shown below, you can see that the HFM is the “sustained thermal limit,” meaning the actual TDP. At no time can the platform exceed its CPU thermal junction (Tj) or external chassis (Tskin) temperature limits as measured by thermal monitors. If these are exceeded, the platform throttles back to the LFM or ULFM “recovery points” to cool off, then holds at or below the HFM threshold until enough headroom reappears for another burst. Naturally, these transitions all happen within fractions of a second.
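The throttle-and-recover behavior described above can be sketched as a simple control loop. Everything here is illustrative — the temperature limits, headroom margins, and state names are made up for the sketch, not taken from Intel’s firmware:

```python
def next_state(tj, tskin, tj_limit=90.0, tskin_limit=45.0,
               burst_requested=False):
    """Toy burst governor: pick the next power state from thermals.

    tj/tskin are the junction and chassis-skin temperatures (deg C);
    the limit and margin values are placeholders, not Intel specs.
    """
    if tj > tj_limit or tskin > tskin_limit:
        return "LFM"    # throttle to a recovery point to cool off
    # require comfortable margin under both limits before bursting
    headroom = (tj_limit - tj) > 10.0 and (tskin_limit - tskin) > 5.0
    if burst_requested and headroom:
        return "BURST"  # enough thermal headroom for another burst
    return "HFM"        # otherwise hold the sustained thermal limit

print(next_state(tj=95.0, tskin=40.0))                        # over Tj: LFM
print(next_state(tj=60.0, tskin=35.0, burst_requested=True))  # cool: BURST
```

Real hardware runs this arbitration in silicon at sub-second granularity; the sketch just captures the decision order — limits first, headroom second, HFM as the fallback.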

With Turbo Boost, there’s a defined, guaranteed frequency and a specific set limit. When X number of cores go idle, you know the remaining cores will jump to Y frequency; the BIOS plays no part in choosing that frequency. With Burst Mode, though, frequencies are governed by the BIOS. In fact, as Intel puts it, “Burst Mode frequencies [can be] enumerated as P-states” by the BIOS, and multiple Burst Mode exit policies can be defined.

Another facet of Burst Mode is that it supports “race-to-halt” power profiles as driven by Operating System-directed Power Management (OSPM). Race-to-halt reflects the same concept found in server computing environments: the object of the game is to blast through work as quickly as possible in order to revert to a low-power, idle state. Although a burst runs in a high-power mode, total work in CPU-bound loads gets finished in less time than if the load ran at a “normal” speed and standard power level, so net power is saved. OSPM functionality is directed through drivers and now supports power-down modes while the device is still active.
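The race-to-halt arithmetic is easy to sketch with made-up numbers. The power figures below are purely illustrative, not Lincroft measurements; the point is only that finishing fast and idling can beat running slow and steady:

```python
def energy_mj(active_power_mw, active_s, idle_power_mw, total_s):
    """Energy (millijoules) over a fixed window: run at active_power
    for active_s seconds, idle at idle_power for the remainder.
    All numbers here are illustrative placeholders."""
    return active_power_mw * active_s + idle_power_mw * (total_s - active_s)

# Same workload over a 10-second window: a burst finishes it in 2 s,
# a "normal" clock needs all 10 s. Idle draw is tiny (race-to-halt).
burst  = energy_mj(active_power_mw=2000, active_s=2,  idle_power_mw=50, total_s=10)
steady = energy_mj(active_power_mw=700,  active_s=10, idle_power_mw=50, total_s=10)
print(burst, steady)  # 4400 vs. 7000 mJ: the burst saves net energy
```

The trade only pays off when idle power is much lower than active power, which is exactly what Moorestown’s deep idle states are engineered to provide.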

OSPM works in conjunction with hardware-based power controls, acting in a sort of advisory role. Software sets power policies and constraints, but hardware ultimately does the fine-grained power management. As you might expect, power and performance needs will vary depending on the application being used, and part of OSPM involves leveraging middleware profiles based on common hardware and software activities. In the common event of multitasking, where different usage profiles might apply concurrently, hardware ultimately gets the last word.
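One way to picture that advisory split: software proposes a policy ceiling, and hardware clamps it against its own constraints before anything takes effect. All names and values here are hypothetical, not a real OSPM interface:

```python
# Illustrative frequency table (MHz); only ULFM and Burst reflect
# Intel's stated clocks, the middle values are assumptions.
FREQ_MHZ = {"ULFM": 200, "LFM": 600, "HFM": 1100, "BURST": 1900}

def resolve_state(os_policy_cap, hw_thermal_cap):
    """Software (OSPM) advises a cap; hardware gets the last word.
    Returns the highest state that fits under both caps."""
    allowed = min(FREQ_MHZ[os_policy_cap], FREQ_MHZ[hw_thermal_cap])
    return max((s for s, f in FREQ_MHZ.items() if f <= allowed),
               key=FREQ_MHZ.get)

print(resolve_state("BURST", "HFM"))  # OS asks for a burst, thermals say HFM
```

The `min()` of the two caps is the whole design in miniature: the OS can only ever lower the ceiling, never raise it past what the silicon will allow.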

Comment from the forums
  • silverx75
    Man, and the HTC Incredible just came out....
  • yannifb
    Huh, i wonder how this will compete with Bobcat, which supposedly will have 90% of desktop chip performance according to AMD.
  • descendency
    Why isn't this a 32nm product yet? If your concern (which it would be with said devices) is power consumption, shrinking the die can only help...
  • Greg_77
    silverx75: Man, and the HTC Incredible just came out....

    Man, and I just got the HTC Incredible... ;)

    And so the march of technology continues!
  • Anonymous
    well we can only wait till amd gets their ULV chips out with their on die graphics so we can get a nice comparison.
  • Chemist87
    Can it run Crysis?
  • williamvw
    descendency: Why isn't this a 32nm product yet? If your concern (which it would be with said devices) is power consumption, shrinking the die can only help...

    Time to market. 45 nm was quicker for development and it accomplished what needed to get done at this time. That's the official answer. Unofficially, sure, we all know 32 nm will help, but this is business for consumers. Right or wrong, you don't play all of your cards right away.
  • seboj
    I've only had time to read half the article so far, but I'm excited! Good stuff, good stuff.
  • burnley14
    This is more exciting to me than the release of 6-core processors and the like because these advances produce tangible results for my daily use. Good work Intel!
  • ta152h
    Do we really need x86 plaguing phones now? Good God, why didn't they use a more efficient instruction set for this? Compatibility isn't very important with the PC, since all the software will be new anyway.

    I like the Atom, but not in this role. x86 adds inefficiencies that aren't balanced by a need for compatibility in this market.
  • liquidsnake718
    I wonder how this would stack up in terms of simple benches when compared to the atoms? Definitely for power this one is a sure winner by far but this will be interesting to see since the lines between server, desktop, laptop, netbook, and mobile phone processors are getting blurred
  • anamaniac
    I'm impressed, and I hope this goes far. Sounds like some awesome performance for a x86 chip that competes to RISC chips.

    I was considering buying a Sony Satio, but now I may rethink it.
    1366x768 multi-touch S-AMOLED, magnesium case, 802.11 b/g/n, 3G/4G, miniDP, miniHDMI, miniDVI, microUSB, 64GB high quality flash memory, 12MP main camera with a 5MP front facing camera, a new turbo boost that pumps cocaine into the chip until it gets too hot when the performance is needed but puts the chip to sleep in idle, and a Linux based OS specifically tailored to the chip. Sounds like something I would pay a lot for. Complete desktop PC replacement. :)

    Don't disappoint me Intel. I was hoping for 32nm 8 core LGA 1366 chips by now when I originally bought my i7 system, and you already disappointed me.

    Now only if 5GB/month on 3G didn't cost $85/month in my area, never mind the texting/calling plan.
  • technoholic
    Intel will for sure put these advancements in their upcoming Desktop CPU families. Low power consumption + high performance anyone?
  • steddy
    I noticed that on the last page of the article there was a reference to "IA Architecture". Is that a typo, or did you mean to be redundant?
  • JohnnyLucky
    Read the whole article. Read several sections twice. It sure sounds good. Wondering what the monthly fee for service will be in 2015.
  • jesseralston
    As mentioned earlier, has developed a tight allegiance to the Linux-based MeeGo OS, formerly known as Moblin before Intel and Nokia joined hands.
    Missed something here that seems fairly critical to the sentence.
  • Snipergod87
    The next checkbox item is battery life. The reality is that we all charge our phones every night. Occasionally, some unforeseen adventure or bout of brain impairment might result in needing to stretch for three or four days, but it’s rare to need a phone’s standby battery time to last for more than 48 hours

    I charge my phone once every week, i would be pretty angry if it didnt hold a charge longer than 48 hours.
  • erloas
    I also only charge my phone once a week, if that. On the same token my phone is now 2 years old and still holds a charge for a week. A lot of people that charge their phone every day also tend to have phones that won't hold a charge longer then a day or two after a year anyway.

    I also don't see the use of all these MIDs. I hardly even take my laptop out because I have a desktop and other then movement there is nothing the laptop can do that I wouldn't rather use my desktop for.

    MIDs might be ok if they didn't cost an extra $30-50 a month to get access to the internet which I'm already paying $30-50 a month for for my general usage. They might start making sense when someone like Qwest starts included DSL and wireless together for a single reasonable monthly fee so I'm not paying twice for the same thing.

    And unless you absolutely have to know the instant you get an email, and can't go more then a few hours without updating your facebook page, I don't see a daily usage for mobile internet. I probably don't think "boy it would be nice if I could check the internet while I'm out" more then once every couple months.
  • neiroatopelcc
    Article: ...will be things like gaming consoles, connected cars, or whatever, we’re still talking about multiple billions of connected handheld devices in use.

    Good luck holding a car in your hands!

    Anyhow, the article seems mighty detailed compared to what we're used to here. Usually only don writes anything this detailed.

    Nice read, though imo the first page looks very much like a bought article.
  • jecastej
    Yeah great news I think about what this all means for me! The ultramobile sector growing so fast and becoming more and more preeminent. So much excitement at your hand disposal. I don't know, call me pessimist but when looking at those charts I think the best years for desktop computing started to decline a while ago, sniff. Why, well because I see that the huge market dictates where the real money goes for development. Up to these days the desktop enjoyed most of the investment and this is because the mass market wanted faster computers for everything. And now a regular laptop is powerful enough for 90% of the tasks most users do and will sell 2 or 3 or more times faster. Soon smaller mobile form factor PCs will dominate and I guess my beloved desktop and workstation parts will start to cost more and be updated less frequently. I am sitting in front of a workstation all day long and I desire a faster progression for the workstations and no sign of a slowdown.

    Anyway beside the progress in the mobile and ultramobile sector I picture in the not so distant future an ultramobile CPU with memory and graphics and storage system the size of a phone in a modular and stackable design and you will have some very serious and scalable mobile supercomputing power. But are mobile form factor CPUs ever going to surpass the need for a desktop machine? Has the computing revolution started from the bottom up and I just noticed?