
The Need For SoC

Tom's Talks Moorestown With The Father Of Centrino

TH: We’ve had system-on-chip designs for many years, obviously. Why did it take Intel until now to come out with its own SoC?

TT: That’s kind of a complex question. Let’s talk about the notebook, which was my last platform before I worked on this one. The notebook platform has very little motivation to shrink in size, especially in desktop replacements. Several years ago, we were trying to get the desktop side to adopt a lot of the notebook’s capabilities. [Ed.: Presumably, this refers to the mobile-on-desktop effort back from the Core Duo days.]

But the desktop industry and users had very little motivation because of the developed component ecosystem for power delivery, heatsinks—the whole nine yards needed to build a desktop. And a similar thing has evolved around the notebook. The needs for space and capability are very different between these platforms. When not driven by the constraints of size and capability, these platforms can use existing components and programmable logic to do, for example, video decode and encode. There’s little motivation for them to move to an SoC-like environment.

But when you come to such things as handheld devices, set-top boxes, and embedded systems, all of these have size and power constraints. Constraint, the mother of invention, is what drove us to design SoCs. We needed a such-and-such size chip with certain capabilities and power—high performance CPU, memory controller, graphics controller, video controller, decoder/encoder. You have to wire all of those things up into that limited real estate. That’s what drove us into building an SoC for this class of devices—need more than anything else.

Also, I should add that this wasn’t a focus area for us 10 years ago. The phone was not an Intel priority segment until we decided to take on the battle. When we saw the phone becoming more of a handheld computer, it became a focus segment and put us down this road.

TH: Are there limits to what makes sense to integrate onto an SoC?

TT: No. It really depends on the size. For a phone, your size is constrained by the silicon area and what will be allowed on the footprint of the phone device. If it’s a 12 x 12 mm chip, we’ll try and put as many things as possible in it.

  • 1
    whitecrowro , June 21, 2010 6:32 AM
    "Why are we all here today? What is the meaning of Moorestown?
    Ticky Thakkar: Our vision was to.."
    - pardon me, but all this naming sounds like a Star Trek interview, on Tau Cygna (an M-class planet in the Orion Nebula).
  • 7
    cmcghee358 , June 21, 2010 6:52 AM
    It would be nice to see Intel take a jab at discrete desktop graphics. If anything just to provide more competition for the consumer.
  • 0
    liquidsnake718 , June 21, 2010 8:14 AM
    It would be nice to see that Zune HD ver 2.0 or even 3.0 with an updated Moorestown and a better Nvidia chip than the Ion or Ion 2, with at least 2.0GHz and 2GB of RAM, all the size of the Zune.... imagine 48 hours of music and 5 hours of video; this will only get larger as time goes by.... hopefully in a year or a year and a half we can see some TRUE iPhone competition now with the new Windows Mobile out! We just need more apps
  • 1
    Onus , June 21, 2010 1:18 PM
    It never occurred to me to want an iPhone, but I definitely see one of these in my future.
  • 2
    matt314 , June 21, 2010 1:32 PM
    cmcghee358: It would be nice to see Intel take a jab at discrete desktop graphics. If anything just to provide more competition for the consumer.

    ...discrete desktop graphics is a pretty niche market. Without any experience in the field or specialized engineers, it would cost them a lot of money in R&D, and they would not be able to beat ATI or nVidia (neither in performance nor sales)
  • 0
    cknobman , June 21, 2010 1:54 PM
    Maybe it's just me, but I read the entire thing and Mr. Shreekant (Ticky) Thakkar came off as an arrogant ********.
  • 3
    Onus , June 21, 2010 1:59 PM
    cknobman: Maybe its just me but I read the entire thing and Mr. Shreekant (Ticky) Thakkar came off as a arrogant dickhead.

    Merely disagreeing with you doesn't merit a "thumbs-down," but I didn't get that impression. Confidence, maybe; his experience no doubt backs that up, but I didn't find him arrogant. I liked how he called BS on the FUD.
  • 0
    zodiacfml , June 21, 2010 3:49 PM
    I read his comments carefully and found that those were carefully chosen words. Confidence is very much needed to get the support of everyone while remaining factual.

    In summary, I expect their device to perform better than anything else in the future, at the expense of a huge and heavy battery to power the Atom and the huge screen making use of the excess performance.

  • 0
    cjl , June 21, 2010 4:11 PM
    zodiacfml: I read his comments carefully and found that those were carefully chosen words. Confidence is very much needed to get the support everyone while remaining factual. In summary, I expect their device to be better performing than anything else in the future at the expense of a huge and heavy battery to power the Atom and the Huge screen making use of excess performance.

    Did you read the article? One of the points raised was that the battery life should be just fine, contrary to many people's assumptions.
  • 1
    eyemaster , June 21, 2010 4:15 PM
    He knows his product, the targets to meet and what they have accomplished. I'm sure they experimented on competing devices too. The man knows that they have a great product in their hands right now that beats all the others. That makes him confident, not arrogant.
  • 0
    noob2222 , June 21, 2010 4:39 PM
    Quote:
    TT: We’ve added graphics capability that includes both vertex and floating point, as well as the rendering capabilities increasing. But you have to really look at system-level performance. The key thing there is whether we provide enough bandwidth capability to the graphics accelerator. Because if you don’t, most of these things will get choked right where you need the bandwidth. Balancing system throughput is key. Look at our memory subsystem and design internally. We put in a lot of energy in there to make sure we have a very effective bandwidth, not just for the CPU but also for the accelerator. Unfortunately, I can’t go into more details than that at this point.


    Otherwise: "our GPU will still suck, but the CPU will be faster, making the GPU seem faster."
    If they can make a phone with a sliding panel to enlarge the screen, that would be cool. Putting a laptop to your ear to make a call makes me laugh.
  • 1
    Anonymous , June 21, 2010 7:06 PM
    I'm interested in the newer Atom platform!
    Too bad Windows is dependent on the PCIe bus. Perhaps it would be possible to have a Windows patch that could re-route the PCIe bus to whatever bus the mobile device has replaced it with!
    That way you'd be able to run Windows on this mobile device, and you could lower the power consumption of an Atom processor system even further!
  • 0
    Anonymous , June 21, 2010 7:07 PM
    ProDigit80: Too bad Windows is dependent on the PCIE bus. Perhaps it would be possible to have a windows patch that could re-rout the PCIE bus to whatever bus the mobile device has replaced it with!

    Or perhaps a null driver.
  • 1
    JonnyDough , June 21, 2010 7:52 PM
    Quote:
    Maybe connectivity’s not available or, if you’re traveling abroad, for example, that connectivity may be very expensive. Our view is that you want to do as much as possible on the handset and then use the cloud. Connectivity is going to be king for this class of mobile devices, but you don’t want to depend on it to do your work.


    I'm surprised he didn't mention SECURITY, as this is usually a #1 issue with the idea of cloud computing.
  • 1
    ta152h , June 21, 2010 9:18 PM
    x86 isn't going anywhere on phones. It's too inefficient, and there is already an established base. They ran into the same thing with Larrabee when they tried to move x86 to GPUs.

    They'll get shut down again.
  • 1
    Anonymous , June 22, 2010 1:44 AM
    Yeah, this is what Intel does best... If you can't use your dominance to win (or at least stifle) the competition, just spin a complete loss into a victory. ARM is 100% superior in this form factor, period. x86 still uses too much power; its only advantage is that it can run Windows. Anybody who's kept up with various Linux OSes for the past couple of years realizes that Windows compatibility just isn't much of an advantage anymore.
  • 0
    amnotanoobie , June 22, 2010 2:51 AM
    Would love to see what they could do with the Atom with the lessons they learned on this one.

    With the netbook cannibalization, might it be that they have just made new customers who wouldn't have bought a full-blown notebook anyway?
  • -1
    elel , June 22, 2010 3:00 AM
    This sounds cool, but the release that I am really looking forward to is Bulldozer.
  • -1
    jimmysmitty , June 22, 2010 3:14 AM
    TA152H: x86 isn't going anywhere on phones. It's too inefficient, and there is already an established base. They ran into the same thing with Larrabee when they tried to move x86 to GPUs. They'll get shut down again.


    No, from what I can see. This CPU delivers more performance at about the same power envelope as most UMDs out there. So what's the loss?

    And besides, while Atom is based on x86, it is nothing like standard x86 CPUs.
  • 0
    ordcestus , June 22, 2010 5:07 AM
    matt314: ...discrete desktop graphics is a pretty niche market. Without any experience in the field or specialized engineers, it would cost them alot of money in R&D, and they would not be able to beat ATI or nVidia (neither in performance nor sales)

    They certainly won't try again anytime soon. Of course, the discrete graphics market has maybe 20 years left, and then it'll be like sound cards, where the motherboard has an integrated one that's already great.