
PCBye: Researchers Want to Ditch the Motherboard

(Image credit: ASRock/Shutterstock)

University of California, Los Angeles researchers want to do the unthinkable: kill the motherboard. In a recent piece for IEEE Spectrum, the researchers said this act of technological matricide would enable the creation of more powerful systems that aren't constrained by the printed circuit board (PCB) used today, all thanks to a new silicon-interconnect fabric that can be used in the motherboard's stead.

The researchers, Puneet Gupta and Subramanian Iyer, said this change would enable the development of all kinds of systems. They contend that relying on PCBs makes it harder for companies to develop smaller devices like smartwatches while also inhibiting the growth of larger devices used in data centers. Their silicon-interconnect fabric is supposed to enable smaller and larger devices. They explained:

"Our research shows that the printed circuit board could be replaced with the same material that makes up the chips that are attached to it, namely silicon. Such a move would lead to smaller, lighter-weight systems for wearables and other size-constrained gadgets, and also to incredibly powerful high-performance computers that would pack dozens of servers’ worth of computing capability onto a dinner-plate-size wafer of silicon."

Gupta and Iyer also said the silicon-interconnect fabric would allow chip makers to stop relying on "the (relatively) big, complicated, and difficult-to-manufacture systems-on-chips that currently run everything from smartphones to supercomputers." Instead they would be able to "use a conglomeration of smaller, simpler-to-design, and easier-to-manufacture chiplets tightly interconnected" on their fabric.

They note that relying on chiplets instead of SoCs isn't a novel idea. Intel, Nvidia and other semiconductor companies have explored the same concept. But the researchers want their silicon-interconnect fabric to go beyond the new packaging those companies are exploring to overcome what they view as fundamental problems with PCBs: their flexibility, their reliance on soldering and their size.

So how would they address those problems? It starts with "a relatively thick (500-µm to 1-mm) silicon wafer" to which "processors, memory dies, analog and RF chiplets, voltage-regulator modules, and even passive components such as inductors and capacitors can be directly bonded." That would also allow "micrometer-scale copper pillars built onto the silicon substrate" to replace solder bumps.

Those changes would "produce copper-to-copper bonds that are far more reliable than soldered bonds, with fewer materials involved," they said. But perhaps more importantly they would mean "the chip’s I/O ports can be spaced as little as 10 µm apart instead of 500 µm" so one could "therefore pack 2,500 times as many I/O ports on the silicon die without needing the package as a space transformer."
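The 2,500× figure follows directly from the square of the pitch ratio, since I/O ports tile a two-dimensional surface. A quick sanity check of that arithmetic (illustrative only, using the pitch values quoted above):

```python
# I/O port density scales as 1 / pitch^2, because ports tile a 2-D area.
solder_pitch_um = 500  # solder-bump I/O spacing quoted for packaged chips
copper_pitch_um = 10   # copper-pillar spacing quoted for the Si-IF substrate

# Halving the pitch quadruples the port count; a 50x pitch reduction
# therefore yields a 2,500x density gain.
density_gain = (solder_pitch_um / copper_pitch_um) ** 2
print(density_gain)  # 2500.0
```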

Silicon would also be a better heat conductor than the FR-4 material currently used in PCBs, they said, allowing "up to 70 percent more" heat extraction when two heatsinks are placed on the sides of the silicon-interconnect fabric. Better heat extraction means better performing components that don't have to be artificially constrained because otherwise they'd get too hot to run safely.

Gupta and Iyer studied how their silicon-interconnect fabric could affect the size of real-world systems. They found:

"In one study of server designs, we found that using packageless processors based on Si-IF can double the performance of conventional processors because of the higher connectivity and better heat dissipation. Even better, the size of the silicon “circuit board” (for want of a better term) can be reduced from 1,000 cm2 to 400 cm2. Shrinking the system that much has real implications for data-center real estate and the amount of cooling infrastructure needed. At the other extreme, we looked at a small Internet of Things system based on an Arm microcontroller. Using Si-IF here not only shrinks the size of the board by 70 percent but also reduces its weight from 20 grams to 8 grams."
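The quoted reductions work out as follows (illustrative arithmetic on the numbers above, not additional data from the study):

```python
# Server study: the silicon "circuit board" shrinks from 1,000 cm^2 to 400 cm^2.
server_area_cut = 1 - 400 / 1000  # fraction of board area saved

# IoT study: board weight drops from 20 g to 8 g (board area drops 70%).
iot_weight_cut = 1 - 8 / 20       # fraction of weight saved

print(f"server area: -{server_area_cut:.0%}, IoT weight: -{iot_weight_cut:.0%}")
# server area: -60%, IoT weight: -60%
```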

Those are just the benefits afforded to current form factors. The researchers believe silicon-interconnect fabric "should let system designers create computers that would otherwise be impossible, or at least extremely impractical," too. That's assuming development on the technology continues, of course. Right now they're addressing its potential, not promising it's ready to be used in the real world.

  • StewartHH
    With advancements in mobile technology, especially with SoCs, the motherboard is almost gone. The days of the desktop are numbered; it needs to evolve or become a very niche market.

    What I think, instead of expanding the cores of a chip, is to make a system with pluggable cores that you add to the system. So you have multiple slots for CPUs and even GPU slots - you start out with a base system with, say, 4 or 8 cores, and if you want more, add another card in another slot. It would be really cool to keep the existing generation when a new generation comes out. So the motherboard becomes a container of slots, including a slot for I/O, which can all be upgraded. Imagine buying a system and, when a new CPU or GPU comes out, still being able to use the existing hardware on the same system. GPUs may be more complicated.
  • artk2219
    StewartHH said:
    With advancements in mobile technology, especially with SoCs, the motherboard is almost gone. The days of the desktop are numbered; it needs to evolve or become a very niche market.

    What I think, instead of expanding the cores of a chip, is to make a system with pluggable cores that you add to the system. So you have multiple slots for CPUs and even GPU slots - you start out with a base system with, say, 4 or 8 cores, and if you want more, add another card in another slot. It would be really cool to keep the existing generation when a new generation comes out. So the motherboard becomes a container of slots, including a slot for I/O, which can all be upgraded. Imagine buying a system and, when a new CPU or GPU comes out, still being able to use the existing hardware on the same system. GPUs may be more complicated.

    The blade server chassis says hullo. I agree it could be nice to bring that to a more consumer level though.
  • MasterMadBones
    Sounds cool, but the material itself is very fragile. It's also disastrous from a serviceability point of view. I understand the motivation behind it, because it can significantly improve performance and power consumption, but for some markets the tradeoffs are just unacceptable.
  • vaughn2k
    It's been used for 20 years now. Ever heard of SoC/CSP?
    Though it can't be implemented across all platforms due to cost, operation, and application.
    You still need the motherboard to connect everything.
    It's all about landscape.
  • bit_user
    StewartHH said:
    With advancements in mobile technology, especially with SoCs, the motherboard is almost gone.
    Yeah, I'm thinking once cell phones switch to using some form of stacked DRAM, there probably wouldn't be much left on the PCB except power and I/O. And some phones are down to just a USB port for I/O (and concept phones exist without even USB). I think ARM even has a way of getting rid of physical SIMs.

    StewartHH said:
    Imaging buying a system and new cpu or gpu comes out
    Yeah, and if you could somehow swap out the old CPU or GPU and replace them with a new one... what a concept!

    I know you wanted to add, instead of replace. Well, back in the days of multi-CPU motherboards, it was certainly possible to run a multi-CPU config with different CPUs. Operating systems didn't like it, but there's no reason it can't work. Maybe they had to be the same clock speed, but that wouldn't be so hard to address if there were demand.

    And motherboards with multiple x16 (at least physically) PCIe slots can let you plug in a newer GPU alongside your old one! Genius!
  • bit_user
    What I'd worry about is ESD. Wouldn't you need some larger-scale components to offer ESD protection, at least at I/O ports?

    And wouldn't you need some PCB-mounted connectors to deal with the stress & strain placed on them?
  • Giroro
    Saying a PCB should be replaced with silicon is the electronics-engineering equivalent of saying an entire airplane should be built like its indestructible black box (nonsense to the point that I doubt that's what they actually proposed).

    Making an SoC using multiple chiplets that can be mounted on a much simpler PCB? No issue with that, and it already exists... innovating on current chiplet interconnects is probably what they're actually saying.
    ...But if not, do they have any concept whatsoever of how much a silicon wafer costs to produce, or how one would mount connectors for any kind of human interface/display/antennas/Ethernet/etc.? How do they propose to transmit large amounts of current to the different components, or even attach a power supply in the first place? How are you going to mount that SoC in a chassis without socketing it into something?
    We use traditional PCBs for some pretty obvious reasons: They're effective, durable, and cheap.

    Also to be pedantic, I would argue that if a current system doesn't have multiple PCBs, then it already doesn't have a motherboard.
  • Egladios
    I am not in any way a specialist in the field. But in my experience with single-board computers, if one part becomes defective, the whole thing becomes inoperable - unless you want all consumers to be technicians. And as previous comments have pointed out, the wafer size makes it easily breakable. You would limit business to chip manufacturers, and hence create a monopoly. Sadly, many of the board manufacturers would venture into chip manufacturing, and the main companies would end up losing market share. The IBM PC went down because they wanted to do it all.
  • bloodroses
    Egladios said:
    I am not in any way a specialist in the field. But in my experience with single-board computers, if one part becomes defective, the whole thing becomes inoperable.

    Unfortunately, that's the way most electronics are going these days. It's cheaper for them to manufacture overall, and they can charge more to replace the entire unit instead of just a part of it. In a business sense, it's a win/win; unless you're Radio Shack.

    I shudder to think of the day of having to replace an entire all-in-one unit cost-wise equivalent to an i9-7980XE + 1080 Ti just because a cheap $0.10 Realtek audio part quits....
  • g-unit1111
    It's an interesting idea I'll give them that. But wearable tech and data center tech are definitely not the same as desktop PC tech. There's a reason why all in one PCs don't perform on the same level as a 12 core desktop PC that you build yourself. And there's a reason why a 12 core desktop PC doesn't perform the same as a 32 core data center processor.