Companies like IBM and Google have leapt at the opportunity to build toy quantum computers. They’ve produced nice interfaces, so you can play quantum computing games. The access frameworks they’ve provided make it feel like we are on the verge of useful quantum computers.
The public interface, however, hides the relatively slow progress in solving hardware problems.
The case of the vanishing qubits
To get your head around this result, we need to understand three key features of how information is stored and processed in quantum computing. Information is stored in qubits, but a qubit does not just hold a one or a zero; it is really a probability of being a one or a zero. Computations are performed by modifying the probability of a qubit being a one or a zero when it’s measured.
A second important point is that, during a computation, the qubit probabilities are all linked to each other: measuring one qubit restricts (or even reveals) the value of the linked qubits.
A third feature is that between computational operations, the probabilities do not stay constant. Instead, they are like swings, oscillating between unity (you will always measure the qubit to be one) and zero (you will always measure the qubit to be zero). Computation depends on measuring at the right time.
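The swing picture can be made concrete with a toy sketch (my own illustration, not anything from the paper; the frequency is made up): the probability of measuring a one oscillates in time, so the answer you read out depends entirely on when you measure.

```python
import math

# Toy picture of the "swing" (an illustration, not the paper's model):
# the probability of measuring a one oscillates between zero and unity.
def p_one(t, freq=1.0):
    """Probability of reading out a one at time t (arbitrary units)."""
    return math.sin(math.pi * freq * t) ** 2

print(p_one(0.0))   # 0.0 -- measuring now always gives zero
print(p_one(0.5))   # 1.0 -- measuring now always gives one
print(p_one(0.25))  # ~0.5 -- a fifty-fifty coin flip
```

Measure at the right moment and you get a definite answer; measure a quarter-swing early and you get a coin flip.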
Since the qubits are all linked, they can't swing back and forth in isolation; they have to swing together, otherwise the links between their probabilities will break. Once those links break, you can't compute anything anymore.
Noise also plays a role: it randomly pushes the swings, speeding or slowing them, so each qubit slowly (or not so slowly) drifts out of sync with its neighbors. Eventually, I can no longer predict a qubit's probability from how long it has been left to swing. Effectively, the quantumness of the qubit is gone.
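A crude Monte Carlo sketch makes this loss of predictability visible (this is my own toy model; the noise level and step counts are made up, not taken from the paper): let each qubit's swing phase random-walk under noise, and watch how the ensemble-averaged signal decays from one toward zero.

```python
import math
import random

# Toy dephasing model (illustrative only): each qubit's phase random-walks
# under noise, and the ensemble average |<exp(i * phase)>| -- a measure of
# how predictable the swing still is -- decays from one toward zero.
random.seed(1)

def coherence(noise_step, n_steps, n_qubits=1000):
    phases = [0.0] * n_qubits
    for _ in range(n_steps):
        # each time step, noise randomly speeds or slows every swing
        phases = [p + random.gauss(0.0, noise_step) for p in phases]
    re = sum(math.cos(p) for p in phases) / n_qubits
    im = sum(math.sin(p) for p in phases) / n_qubits
    return math.hypot(re, im)

print(coherence(0.07, 10))    # shortly after the start: still close to 1
print(coherence(0.07, 2000))  # much later: near 0 -- quantumness gone
```

The longer the qubits are left to swing, the closer the signal gets to zero, at which point the readout tells you nothing.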
As a result of these issues, the sort of qubit that the researchers used in this work typically lasts about five microseconds, while a typical gate operation takes around 20 nanoseconds. That works out to fewer than 300 gate operations (roughly 250) before the qubit is useless.
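The operation budget is simple arithmetic on the article's rounded figures:

```python
# Back-of-envelope check of the numbers above (the article's figures,
# rounded; the paper's exact values may differ).
coherence_time = 5e-6   # qubit lifetime: about five microseconds
gate_time = 20e-9       # one gate operation: about 20 nanoseconds

ops_before_decoherence = round(coherence_time / gate_time)
print(ops_before_decoherence)  # → 250
```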
Rocking out to a microwave drum
To keep the qubits in sync with each other, the researchers apply a continuous driving microwave signal. The noise still pushes the qubits out of sync, but the effect over any given interval is much smaller. Better still, applying this trick lets the researchers layer on other noise-reducing approaches.
When all of that was applied, the researchers found that their qubit system was stable for at least 36 microseconds. They also showed that they could keep two qubits in sync for more than 60 microseconds while performing gate operations.
The computations also proved to be reliable, with high fidelities. The fidelity is the probability of a single operation going right: I perform an operation that should leave the qubit in a specific state, then check how often I actually get that state. That success rate is the fidelity. The fidelities the researchers obtain are quite high (over 0.97), though not high enough yet. To put that in perspective, after 10 operations, the chance that the qubit is in the target state is only about 70 percent.
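The 70 percent figure is just per-gate fidelities compounding:

```python
# The arithmetic behind the 70 percent figure: each gate succeeds with
# probability 0.97, and those probabilities multiply across operations.
per_gate_fidelity = 0.97
after_ten_ops = per_gate_fidelity ** 10
print(round(after_ten_ops, 3))  # → 0.737, i.e. about 70 percent
```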
The target here is to get single-operation fidelities well over 0.99. If you can do that, then you might be able to accept the occasional error. Multiple runs of the same code should allow the correct result to be determined. If that can’t be achieved, then costly error correction schemes have to be implemented. For these schemes, for every computational qubit, five to nine error correction qubits are required, which is an overhead that we would all love to avoid.
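One way to see why fidelities well over 0.99 change the picture is a back-of-envelope sketch of the repeat-the-run idea (my own numbers and simplifications, not the paper's; a real program would also need a way to recognize the correct answer, such as majority voting):

```python
import math

# If each gate succeeds with probability f, a run of d gates succeeds
# with f ** d, and r repeats give at least one clean run with
# probability 1 - (1 - f ** d) ** r.  This solves for the r that
# reaches a target confidence.
def repeats_for_confidence(f, d, target=0.99):
    p_run = f ** d                      # chance one whole run is error-free
    return math.ceil(math.log(1 - target) / math.log(1 - p_run))

print(repeats_for_confidence(0.97, 10))   # short circuit, current fidelity
print(repeats_for_confidence(0.99, 100))  # longer circuit, higher fidelity
```

With today's 0.97 fidelity, even a 10-gate circuit needs a handful of repeats; at 0.99, circuits ten times longer remain practical with a modest number of repeats, which is why that threshold matters.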
Putting it all together
The technology demonstrated in this paper is, at base, the same as that used by IBM (the work presented here is not IBM’s). The architecture has quite a bit of flexibility: qubits can be connected to and disconnected from each other, meaning that operations on one qubit don’t add noise to the disconnected qubits.
This research allows qubits to be kept in sync, even while they are disconnected from each other. That extends the number of operations that a quantum computer can perform and reduces the chance that the qubit readout will contain errors.
The researchers also speculate on future improvements for their own flavor of qubit. They think that with some redesign work, it might be possible to get another factor of two increase in the time that the qubits stay in sync. If they can do that, then the fidelities of single gates will get high enough to be very useful.