Over the past few years, the quantum computing race has slowly but steadily intensified. IBM was one of the first large technology companies to work on quantum computers. Then came Google and Microsoft, along with a few startups, including the Canadian firm D-Wave. However, many may have wondered where Intel was in this race, and why the world's largest chip maker hadn't joined it.
Moore’s Law is starting to slow down because transistors are now built from just a handful of atoms. We most likely won’t be able to build transistors smaller than a single atom, which is where we enter the quantum world. This is why the death of Moore’s Law feels imminent, and why this time it’s not just a false alarm. We’re already seeing the effects, with chip manufacturers such as Intel switching to a new process node only every three or even four years. Once transistors shrink to a single atom, or a small group of atoms, we may have no choice but to go deeper into the quantum world and build quantum chips instead of classical ones.
Intel doesn’t want to be left behind by the quantum train, and the company now seems to be working harder to develop its own quantum computers. Intel announced that it has already developed a superconducting 17-qubit test chip with the help of QuTech, a research center in the Netherlands. The chip doesn’t seem to be too far behind Google’s experimental 20-qubit quantum computer (which is due for another upgrade soon) or IBM’s soon-to-be commercially ready 17-qubit quantum computer.
Intel's Take On Quantum Computers
According to the QuTech researchers, qubits are so fragile that any unintended observation of them causes data loss. This fragility requires that they operate at a temperature of 20 millikelvin, or 250 times colder than deep space.
Intel said that it’s investigating two types of qubits right now: superconducting qubits and “spin” qubits. The spin qubits resemble electron transistors and could be built in silicon, which would allow Intel to use its existing manufacturing processes to make silicon-based quantum chips, too. However, this seems to be more of a research idea at this point compared to the superconducting 17-qubit chip that Intel has already built.
Intel believes that its lead in chip manufacturing will help the company bring practical quantum computers to market faster.
The company built a new architecture with improved thermal performance and reliability, and reduced radio frequency interference between qubits. It also used a scalable interconnect scheme that allows for 10 to 100 times more signals into and out of the chip compared to wirebonded chips.
“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” said Dr. Michael Mayberry, corporate vice president and managing director of Intel Labs. “Intel’s expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms, from neuromorphic to quantum computing,” he added.
A Hybrid Computing Future
Even though we may be only a few years away from the first practical quantum computer that proves itself more useful than a classical computer (supposedly around the 50-qubit mark), we won’t be able to fully replace our existing computers with quantum computers.
Quantum computers will solve many problems for humanity, but mainly of the scientific kind. They may even give machine learning a large performance boost, if their probabilistic capabilities allow neural networks to be trained to high accuracy on much smaller datasets.
For instance, an AI system wouldn’t need to train against one million pictures of cats to identify a new cat with high accuracy. It could train against only 100, or even 10, pictures of cats before it could recognize any cat as a cat. We may not suffer from a lack of cat pictures on the internet, but there are many other areas where it’s too difficult to produce a large dataset of a given object or entity.
For most other types of computing, we’ll keep using our traditional computers. However, as we stop making smaller transistors, we’ll likely see a stronger focus on brand new architectures and new materials that allow for much higher clock speeds than what silicon transistors can achieve. Intel, as well as other chip makers will likely have to compete in all of these areas to remain relevant, or at least to be able to take advantage of the new markets as they start maturing.
It would be interesting to see an article about real usage cases where qubit-type quantum computers are really useful and work as intended. The whole area is somewhat mystified at the moment.
Since you wrote both articles, please tell me what I'm missing.
They're good at solving optimization problems with lots of interrelated variables.
The point about using them to optimize neural networks means that the deep learning revolution might get a big boost. Far more sophisticated models might be squeezed down to run on phones, robots, self-driving cars, and even IoT devices than would currently fit, today.
Other fields it could revolutionize include materials science and microbiology (i.e. things like protein folding - no more need for Folding @ Home).
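To make the "optimization problems with lots of interrelated variables" point concrete, here's a minimal sketch (my own illustration, not from either article) of one such problem: max-cut on a tiny graph. Classically, a brute-force solver must consider a search space that doubles with every added node, which is exactly why these problems become intractable at scale and why quantum hardware is interesting for them. The graph below is made-up example data.

```python
# Hypothetical illustration: the kind of interrelated-variables optimization
# problem quantum hardware is expected to help with. We brute-force a tiny
# max-cut instance classically; the 2**n search space is the bottleneck.
from itertools import product

# A small example graph as a list of edges (node pairs).
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
n_nodes = 4

def cut_size(assignment, edges):
    """Count edges whose endpoints land on opposite sides of the partition."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

# Enumerate all 2**n partitions -- feasible only for tiny n.
best = max(product([0, 1], repeat=n_nodes), key=lambda a: cut_size(a, edges))
print(best, cut_size(best, edges))  # a partition cutting 4 of the 5 edges
```

Every variable here interacts with its neighbors through the edges, so you can't optimize the nodes independently; that coupling is what makes the problem hard classically and a natural fit for annealers and quantum optimization algorithms.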
"Since you wrote both articles, please tell me what I'm missing."
D-Wave is a quantum annealing computer, not a universal quantum computer. It can only solve very specific types of optimization problems and cannot solve everything a classical computer can solve.
"..D-Wave, the most famous quantum annealer, and universal gate quantum computing are not competitors. While they rely on the same concepts, they are useful for different tasks and different sorts of problems, while also suffering from different challenges in design and manufacturing..." https://medium.com/quantum-bits/what-s-the-difference-between-quantum-annealing-and-universal-gate-quantum-computers-c5e5099175a1
Not the same thing.
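For contrast with annealing, here's a minimal state-vector sketch (my own illustration, with assumed toy code, not anything from the article) of what gate-model computing means: machines like Intel's, IBM's, and Google's apply discrete quantum gates to qubits. A Hadamard gate putting one qubit into an equal superposition is the simplest example.

```python
# Minimal sketch of gate-model (universal) quantum computing, as opposed to
# D-Wave-style annealing. A qubit's state is a pair of amplitudes for the
# basis states |0> and |1>; gates are linear maps on those amplitudes.
import math

state = [1.0, 0.0]  # start in |0>

def hadamard(s):
    """Apply the Hadamard gate: sends |0> to an equal superposition."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[1]), h * (s[0] - s[1])]

state = hadamard(state)
probs = [abs(a) ** 2 for a in state]  # measurement probabilities
print(probs)  # approximately [0.5, 0.5]: both outcomes equally likely
```

An annealer never composes gates like this; it relaxes a physical system toward the low-energy configuration of one specific cost function, which is why the two machine types suit different problems.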
For many things, I'd imagine we'll still be using classical CPU and GPU-type architectures. Especially if quantum gate computers continue to require impractical levels of refrigeration and EMI protection for the typical home user.