Intel Joins Quantum Computer Race With 17-Qubit Research Chip

Over the past few years, the quantum computing race has slowly but steadily intensified. IBM was one of the first large technology companies to work on quantum computers. Then came Google and Microsoft, along with a handful of startups, including Canada's D-Wave. However, many may have wondered where Intel was in this race, and why the world's largest chip maker wasn't joining it.

Moore's Law is slowing down because transistors are now built out of just a handful of atoms. We most likely won't be able to build transistors smaller than a single atom, which is where the quantum world takes over. That's why the death of Moore's Law feels imminent this time, rather than being yet another false alarm. We're already seeing the effects, with chip manufacturers such as Intel switching to a new process node only every three or even four years. Once transistors shrink to a single atom or a small group of atoms, we may have no choice but to go deeper into the quantum world and build quantum chips instead of classical ones.

Intel doesn't want to miss the quantum train, and the company now seems to be working harder to develop its own quantum computers. Intel announced that it has already developed a superconducting 17-qubit test chip with the help of QuTech, a research center in the Netherlands. The chip doesn't seem to be too far behind Google's experimental 20-qubit quantum computer (which is due for another upgrade soon) or IBM's soon-to-be commercially available 17-qubit quantum computer.

Intel's Take On Quantum Computers

According to the QuTech researchers, qubits are so fragile that any unintended observation of them can cause data loss. This fragility requires them to operate at about 20 millikelvin, or roughly 250 times colder than deep space.

Intel said that it's currently investigating two types of qubits: superconducting qubits and “spin” qubits. Spin qubits resemble single-electron transistors and could be built in silicon, which would let Intel use its existing manufacturing processes to produce silicon-based quantum chips, too. However, this appears to be more of a research direction at this point, compared to the superconducting 17-qubit chip Intel has already built.

Intel believes that its lead in chip manufacturing will help it bring practical quantum computers to market faster.

The company built a new architecture with improved thermal performance and reliability, as well as reduced radio frequency (RF) interference between qubits. It also used a scalable interconnect scheme that allows 10 to 100 times more signals in and out of the chip than wirebonded chips.

“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” said Dr. Michael Mayberry, corporate vice president and managing director of Intel Labs. “Intel’s expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms from neuromorphic to quantum computing,” he added.

A Hybrid Computing Future

Even though we may be only a few years away from the first practical quantum computer that proves itself more useful than a classical computer (supposedly once a machine has more than 50 qubits or so), we won't be able to fully replace our existing computers with quantum ones.
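
As a rough, back-of-the-envelope illustration of why a figure around 50 qubits gets cited (this framing is not something Intel stated): simulating a gate-model quantum chip on a classical machine means storing one complex amplitude per basis state, and the number of basis states doubles with every added qubit. At 50 qubits, the state vector alone outgrows the memory of today's largest supercomputers:

```latex
2^{50} \approx 1.1 \times 10^{15} \ \text{amplitudes}
\quad\Longrightarrow\quad
2^{50} \times 16\ \text{bytes} = 2^{54}\ \text{bytes} \approx 18\ \text{PB of state vector}
```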

Quantum computers will solve many problems for humanity, but mainly scientific ones. They may even give machine learning a large performance boost, if their ultra-fast probabilistic processing allows a neural network to be trained to high accuracy on a much smaller dataset.

For instance, an AI system wouldn't need to train on one million pictures of cats to identify a new cat with high accuracy. It might need only 100, or even 10, pictures of cats before it could recognize any cat as a cat. The internet doesn't exactly suffer from a lack of cat pictures, but in many other areas it's too difficult to produce a large dataset of a given object or entity.
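
To make the small-training-set idea concrete in purely classical terms, here is a minimal sketch of a nearest-neighbor classifier, about the simplest model that can be "trained" on a handful of examples. The quantum speed-up itself remains speculative, and the random vectors below are hypothetical stand-ins for image embeddings, not anything Intel or QuTech has published.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for image embeddings: 10 "cat" and 10 "not cat"
# feature vectors drawn from two different distributions.
cats = rng.normal(loc=1.0, scale=0.5, size=(10, 8))
not_cats = rng.normal(loc=-1.0, scale=0.5, size=(10, 8))

train_x = np.vstack([cats, not_cats])
train_y = np.array([1] * 10 + [0] * 10)  # 1 = cat, 0 = not cat

def predict(sample):
    """Label a new embedding by its single nearest training example."""
    distances = np.linalg.norm(train_x - sample, axis=1)
    return train_y[np.argmin(distances)]

# A new sample drawn near the "cat" cluster is classified correctly
# even though the "training set" holds only 20 examples.
new_sample = rng.normal(loc=1.0, scale=0.5, size=8)
print("cat" if predict(new_sample) == 1 else "not cat")
```

The open question the article raises is whether quantum hardware could let far richer models than this one reach high accuracy from similarly tiny datasets.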

For most other types of computing, we'll keep using traditional computers. However, as transistor scaling ends, we'll likely see a stronger focus on brand-new architectures and new materials that allow for much higher clock speeds than silicon transistors can achieve. Intel, as well as other chip makers, will likely have to compete in all of these areas to remain relevant, or at least to take advantage of the new markets as they mature.

Lucian Armasu
Lucian Armasu is a Contributing Writer for Tom's Hardware US. He covers software news and the issues surrounding privacy and security.
  • Zaporro
    Should I upgrade to Coffee Lake or wait for Intel Qubit?
    Reply
  • hannibal
    Hmmm... If you are making weather forecasts and AI programs, where there is not one right answer to the problem, go for Qubit; if you would like to play games or use office programs, take Coffee Lake...

    It would be interesting to see an article about real use cases where qubit-based quantum computers are actually useful and work as intended. The whole area is somewhat mystified at the moment.
    Reply
  • bit_user
    20262631 said:
    we may be only a few years away from the first practical quantum computer that will prove itself more useful than a classical computer (supposedly when one has over 50 qubits)
    How is D-Wave not already well past this point?

    http://www.tomshardware.com/news/google-nasa-upgrade-d-wave-2000q,33882.html

    Since you wrote both articles, please tell me what I'm missing.
    Reply
  • bit_user
    20262858 said:
    It would be interesting to see an article about real usage cases where Qubit type quantum computers are really useful and work as intended.
    Nothing interactive, for one thing. It takes a fairly long time to run a single iteration.

    They're good at solving optimization problems with lots of interrelated variables.

    The point about using them to optimize neural networks means that the deep learning revolution might get a big boost. Far more sophisticated models might be squeezed down to run on phones, robots, self-driving cars, and even IoT devices than currently fit today.

    Other fields it could revolutionize include materials science and microbiology (i.e. things like protein folding - no more need for Folding @ Home).
    Reply
  • vern72
    Can't wait to get my Core i99999 CPU! ;-)
    Reply
  • Upgrademe
    Pre-orders up when? :P
    Reply
  • fastcountach
    "How is D-Wave not already well past this point?

    http://www.tomshardware.com/news/google-nasa-upgrade-d-wave-2000q,33882.html

    Since you wrote both articles, please tell me what I'm missing."

    D-Wave is a quantum annealing computer, not a universal quantum computer. It can only solve very specific types of optimization problems and cannot solve everything a classical computer can solve.
    Reply
  • bit_user
    20263870 said:
    D-wave is a quantum annealing computer, not a universal quantum computer.
    And these other guys aren't? Why are they still talking about qubits, then?
    Reply
  • tsnor
    20263888 said:
    20263870 said:
    D-wave is a quantum annealing computer, not a universal quantum computer.
    And these other guys aren't? Why are they still talking about qubits, then?

    "..D-Wave, the most famous quantum annealer, and universal gate quantum computing are not competitors. While they rely on the same concepts, they are useful for different tasks and different sorts of problems, while also suffering from different challenges in design and manufacturing..." https://medium.com/quantum-bits/what-s-the-difference-between-quantum-annealing-and-universal-gate-quantum-computers-c5e5099175a1

    Not the same thing.

    Reply
  • bit_user
    20264135 said:
    Not the same thing.
    Thanks. I gather they're still quite restricted in what sorts of computation they can do. That article pointed me here:

    http://math.nist.gov/quantum/zoo/

    For many things, I'd imagine we'll still be using classical CPU and GPU-type architectures. Especially if quantum gate computers continue to require impractical levels of refrigeration and EMI protection for the typical home user.
    Reply