Quantum computing is the next great frontier in human technological advancement. The transistor's revolution is plain to see, and its achievements in classical computing are everywhere: the CPUs and GPUs that let us suspend disbelief, the smartphones that keep us connected, and ultimately the Internet, a fabric that has become an indelible element of our reality.
While the transistor allowed for the programmable automation and digitization of human work (and play), quantum computing and its transistor analog, the qubit, will open doors that were previously closed and reveal others we had no idea were even there.
Here's an explanation of what quantum computing is, why we need it, and, at a high level, how it works.
What is Quantum Computing?
Quantum computing is an analog to the computing we know and love. But while classical computing leverages the transistor, quantum computing takes advantage of the world of the infinitely small (the quantum world) to run calculations on specialized hardware known as Quantum Processing Units (QPUs). Qubits are the quantum equivalent of transistors. And while the transistor's development is increasingly constrained by quantum effects and the difficulty of further miniaturization, quantum computing thrives in exactly this regime.
Quantum refers to the smallest discrete unit of a physical property, such as energy. Fittingly, quantum computing's unit, the qubit, is usually built from single atoms or even from subatomic particles such as electrons and photons. But while a transistor can only ever represent two states (either 1 or 0, which gave rise to the binary world within our tech), a qubit can occupy 0, 1, or any weighted combination of both states at the same time. This ability is referred to as superposition, one of the phenomena behind quantum computing's prowess.
Why Do We Need Quantum Computing?
Qubits allow for much more information to be considered and processed simultaneously, opening the door to solving problems with degrees of complexity that would stall even the most powerful present – and future – supercomputers.
Problems with many interacting variables are where quantum computers shine: airplane traffic control (which takes into account speed, tonnage, and the multitude of simultaneous planes, flying or not, within an airspace); sensor placement (such as the BMW Sensor Placement Challenge, recently solved in mere minutes by a quantum computer); the age-old optimization problem of the traveling salesman (finding the shortest route connecting multiple sale locations); and protein folding (predicting which of the trillions of shapes an amino acid chain can take).
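To see why problems like the traveling salesman overwhelm classical machines, consider the brute-force approach: the number of routes to check grows factorially with the number of cities. A minimal sketch (the cities and distances below are made up for illustration):

```python
# Toy traveling salesman: exhaustively check every route for a handful
# of hypothetical cities, then show how fast the route count explodes.
from itertools import permutations
from math import factorial

# Symmetric distance matrix for 5 made-up cities, indexed 0..4.
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def shortest_tour(n):
    """Check every tour that starts and ends at city 0; return the best."""
    best_len, best_route = float("inf"), None
    for perm in permutations(range(1, n)):
        route = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if length < best_len:
            best_len, best_route = length, route
    return best_len, best_route

length, route = shortest_tour(5)
print(length, route)   # optimal tour among 4! = 24 candidate routes
print(factorial(24))   # routes to check for just 25 cities: ~6.2e23
```

Five cities mean only 24 routes, but 25 cities already mean more routes than there are stars in the observable universe, which is why classical brute force stalls and heuristics (or quantum approaches) become attractive.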
Quantum computing will also render today's widely used public-key cryptographic algorithms moot: protection that would take even the most powerful supercomputers too long to break at a human time scale could fall in moments on a quantum computer. This frames another element of the race for quantum computers: the ability to create cryptographic algorithms that can withstand them. Institutions such as the National Institute of Standards and Technology (NIST) have been putting new post-quantum solutions through their paces to find ones that can guarantee security in the post-quantum future.
Materials science, chemistry, cryptography, and multivariate problem solving are quantum computing’s proverbial home. And more are sure to materialize as we grasp this technology’s capabilities.
What Is Quantum Superposition?
If you were to imagine the flip of a coin, classical computing would record its result as a 0 or a 1, depending on whether the flip ends in heads or tails. In the qubit world, however, you'd be able to see both heads and tails simultaneously, along with every intermediate position the coin passes through as it spins between the two outcomes.
While classical computers work with deterministic outcomes, quantum computing operates in the realm of probabilities. This abundance of possible states allows quantum computers to process much more information than a binary system ever could.
Other important quantum computing concepts besides superposition are entanglement and quantum interference.
What Is Quantum Entanglement?
Entanglement happens when two qubits have been inextricably connected in such a way that you can’t describe the state of one of them without describing the state of the other. As a result, they’ve become a single system and influence one another — even though they are separate qubits.
Their states are correlated: depending on the type of entanglement, the particles may be in the same or in opposite states, but knowing the state of one tells you the state of the other. This holds across any distance; entangled particles have no physical limit on how far apart they can be. This is why Einstein called entanglement "spooky action at a distance."
Imagine that you're watching a tennis match. The two players are correlated: the movements of one provoke a countermovement from the other. If you were to describe why player A moved to one point of the court and hit the ball towards one area of their opponent's side, you'd have to consider player B's previous actions, current position, style of play, and several other factors. To describe the actions (or, in the qubit sense, the state) of one, you can't ignore the actions (or state) of the other.
What Is Quantum Noise?
Any system that’s trying to be balanced and coherent must withstand outside interference. This is why many computer components, such as audio cards, feature EMI (ElectroMagnetic Interference) shielding, or your house has insulation that tries to keep its environment stabler than what the world actually looks like outside your windows.
In quantum computing, coherence is a much, much more fickle affair. Qubit states and qubit entanglement are especially prone to environmental interference (noise) and can collapse within a microsecond (a millionth of a second). This noise can take the form of radiation; temperature (which is why some qubit designs need to be cooled to near absolute zero, or −273.15 °C); activity from neighboring qubits (just as closely packed transistors interfere with one another today); and even impacts from subatomic particles invisible to the naked eye. And these are just some of the possible sources of noise that introduce errors into a quantum computation, compromising the results.
In classical computing, an error usually flips a bit (from 0 to 1 or vice versa). But in quantum computing, as we've seen, information lives in a continuum of intermediate states, so errors can corrupt a computation in far more ways than a simple bit flip.
This puts practical limitations on the amount of time a quantum computer’s qubits are operational, how long their entangled states last, and how accurate their results are.
More noise means a qubit's state can change or collapse (decohere) before a given workload is finished, producing a wrong result. Quantum computing therefore tries to reduce environmental noise as much as possible, to implement error correction that detects and compensates for interference, and to speed up qubit operations so that more work gets done before coherence is lost.
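The core intuition behind error correction can be sketched classically: encode one logical bit redundantly, let noise corrupt the copies, and recover by majority vote. Real quantum error correction is far subtler (quantum states can't simply be copied, so codes like the surface code measure error syndromes instead), but the redundancy-plus-voting intuition carries over. A toy sketch with invented numbers:

```python
# Three-bit repetition code: the simplest classical error-correcting
# code. The logical bit survives unless 2+ of its 3 copies flip.
import random

def encode(bit):
    """Encode one logical bit as three physical copies."""
    return [bit] * 3

def noisy_channel(bits, flip_prob):
    """Each physical bit flips independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one copy flipped."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
# Send logical 0 through a channel with a 10% physical error rate.
logical_errors = sum(
    decode(noisy_channel(encode(0), 0.1)) for _ in range(10_000)
)
print(logical_errors)  # far fewer than the ~1,000 raw flips you'd expect
```

With a 10% physical error rate, the logical error rate drops to roughly 2.8% (two or three copies must flip at once), and adding more redundancy suppresses it further. That is the trade quantum error correction makes too: many noisy physical qubits buy one reliable logical qubit.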
What Are the Current Challenges in Quantum Computing?
Quantum computing research is one of the most complex topics known to humankind, placing an immediate barrier on who can pursue it. Typically, only the wealthiest institutions or Big Tech companies have dipped their toes into it in any significant manner.
Only a few scientists can (and want to) work in this field, and its infancy demands significant investment in materials, iterative development, and research funding.
The field is in its early stages, too, which is a challenge (or a playground, depending on how you see it). Currently, multiple companies are following their own, disparate roads towards building a functional quantum computer. IBM has chosen the superconducting qubit as its weapon of choice; Quantum Brilliance works with diamond-based qubits that can operate at ambient temperatures; QCI has gone the Entropy Quantum Computing (EQC) route, which tries to take environmental interference into account; Xanadu’s Borealis QPU leverages photonics; Microsoft is still pursuing topological qubits that haven’t even materialized yet.
Each of these companies extols the merits of its chosen approach, and each has reasons to invest in it, born of thousands of hours of work and millions of dollars invested.
It's important not to frame this so much as a race; it simply means there are multiple avenues of exploration. But there is, in fact, a race for funding and market share. The company that first breaks through to quantum advantage, the point where a quantum computer provably outpaces any existing or future supercomputer in solving a particular problem or set of problems, will be the first to reap the benefits.
And being the first to walk the next step for humanity’s computing sciences has indisputable advantages in shaping its future.
What’s the Outlook for Quantum Computing?
Currently, quantum computers are still in the Noisy Intermediate-Scale Quantum (NISQ) era. Scientists are struggling to scale to the higher qubit counts and more complex qubit arrangements needed to unlock more powerful quantum computers, mostly due to the quantum noise we discussed earlier. However, solving this problem is only a matter of time. Post-NISQ quantum devices will eventually arrive, even if the absence of a name for that era is itself a nod to the long road ahead.
Expectations for quantum computing market growth vary, but most projections point to a market worth $20 billion to $30 billion by 2030. This is an ecosystem seeing daily breakthroughs, though; it would take only one of them to accelerate the road towards the coveted age of quantum supremacy and cast those projections aside.
As the state of quantum computing currently stands, we can expect an acceleration in the pace of development and in the number of qubits deployed in quantum processing units. IBM's roadmap is one of the clearest: the company expects to have as many as 433 operational qubits this year through its Osprey QPU, more than triple those found in its 2021 QPU, Eagle. The company aims to have a 1,121-qubit QPU (Condor) by 2023, and projects its QPUs will house more than 1 million qubits from 2026 onward.
That said, the exact number of qubits needed to leave the NISQ era behind is unclear; different qubit designs have different capabilities and can produce different amounts of work. Going forward, standardization is the name of the game: IBM's proposed CLOPS (Circuit Layer Operations Per Second) performance metric is one example in a still-nascent industry trying to coalesce. Concerted industry efforts to standardize comparisons between different QPUs are also underway and are a prerequisite for the healthy future of the space.
It's a whole, wide world in the quantum computing space. And we’re just getting started.