A persistent quantum computing error finally explained

Scientists have discovered the cause of a persistent glitch that disrupts superconducting quantum computers even when they have built-in defenses. For all their advanced hardware, these machines are vulnerable to errors caused by ionizing radiation from space or the environment. When a radiation particle strikes the chip substrate (the silicon base the processor is built on), it deposits energy that creates rogue excitations called quasiparticles, which disrupt the qubits, the basic units of quantum computers.

A defense with a flaw

To protect against this, scientists developed a technique called gap engineering. This involves creating an energy barrier in the superconducting material of the qubits, making it harder for these particles to reach sensitive parts of the device.

However, the defense is not foolproof. Even with gap engineering in place, radiation can still cause sudden, widespread errors that affect many qubits at once (error bursts), and until now it was not clear why.

To get to the bottom of this mystery, Vladislav Kurilovich and his colleagues at Google Quantum AI in California developed a measurement protocol to monitor qubits. Details of their work are published in a paper in the journal Physical Review X.

They used a 72-qubit Willow processor to perform rapid, repeated measurements on the qubits every few microseconds, aiming to catch the error bursts as they happened.
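The paper's protocol is more sophisticated than this, but the basic logic of burst hunting is easy to sketch. The toy Python model below (our illustration, with made-up error rates, not the authors' method) streams measurement rounds from a simulated 72-qubit array and flags rounds in which an unusually large number of qubits report errors at the same time.

```python
import random

# Toy model of burst hunting -- an illustration with assumed error
# rates, not the authors' protocol. Most rounds show only sparse
# background errors; a rare "radiation impact" flips many qubits
# in the same round, which is the signature we scan for.
N_QUBITS = 72
N_ROUNDS = 10_000
BACKGROUND_ERROR = 0.01   # per-qubit, per-round error rate (assumed)
IMPACT_PROB = 0.0005      # chance of a radiation event per round (assumed)
IMPACT_ERROR = 0.5        # per-qubit error rate during an impact (assumed)
BURST_THRESHOLD = 10      # flag rounds where >= 10 qubits error together

random.seed(0)
bursts = []
for t in range(N_ROUNDS):
    p = IMPACT_ERROR if random.random() < IMPACT_PROB else BACKGROUND_ERROR
    errors = sum(random.random() < p for _ in range(N_QUBITS))
    if errors >= BURST_THRESHOLD:
        bursts.append((t, errors))

print(f"Flagged {len(bursts)} candidate error bursts out of {N_ROUNDS} rounds")
for t, n in bursts:
    print(f"  round {t}: {n}/{N_QUBITS} qubits errored simultaneously")
```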

A new type of error

The team discovered that even when quasiparticles cannot tunnel through the energy barriers, they still cause problems in another way: they shift the qubits' frequencies by up to 3 MHz. The knock-on effect is that the qubits fall out of synchronization, no longer matching the frequency of the microwave pulses used to control them, and so they accumulate phase errors, incorrect shifts in the quantum state.
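To see why a few megahertz matters, a back-of-the-envelope estimate (ours, not from the paper) helps: a qubit whose frequency is offset from its control pulses by a shift delta_f accumulates relative phase at a rate of 2 * pi * delta_f. The short calculation below uses the 3 MHz figure reported in the study and a hypothetical one-microsecond idle time.

```python
import math

# Back-of-the-envelope estimate (ours, not from the paper): a qubit
# whose frequency is offset from its control pulses by delta_f
# accumulates relative phase phi = 2 * pi * delta_f * t.
delta_f = 3e6   # 3 MHz shift, the figure reported in the study
t = 1e-6        # hypothetical 1 microsecond of idle time

phi = 2 * math.pi * delta_f * t
print(f"Accumulated phase: {phi:.2f} rad "
      f"({phi / (2 * math.pi):.1f} full rotations)")
# ~18.85 rad, i.e. three full turns -- far beyond any tolerable error.
```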

"We have uncovered a new type of correlated error caused by ionizing radiation impacts, namely, correlated phase errors," wrote the study authors in their paper. This explains why previous attempts to detect the error hit a wall. "Correlated phase error bursts provide a plausible explanation for the origin of the repetition code LER floor observed in Ref. [20]."

Here, the team is referring to an earlier Google experiment in which error correction ran into an LER (logical error rate) floor, a point at which the computer's error rate stops improving no matter how much correction is applied.

In addition to identifying the problem, the Google researchers developed a mitigation strategy. They used echo pulses, additional control operations that cancel the unwanted phase shifts, making the qubits less sensitive to these frequency jumps.
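The article does not detail the pulse sequence, but the cancellation mechanism is the textbook spin-echo effect: a pi pulse halfway through an idle period flips the qubit, so phase accumulated in the second half undoes the phase from the first. The sketch below (a minimal illustration with assumed timing, not the team's actual sequence) shows a static frequency offset canceling out.

```python
import math

# Minimal sketch of the echo idea (standard spin-echo physics, not the
# specific sequence used by the Google team). A pi pulse halfway
# through an idle period flips the qubit, so phase accumulated in the
# second half cancels phase accumulated in the first half -- provided
# the frequency offset stays roughly constant over the interval.
delta_f = 3e6    # static frequency offset (Hz), e.g. after an impact
tau = 0.5e-6     # half of the idle period (assumed value)

# Without echo: phase accumulates uninterrupted for the full period.
phase_no_echo = 2 * math.pi * delta_f * (2 * tau)

# With echo: the pi pulse negates the phase accumulated so far,
# so the second half of the evolution cancels the first half.
phase_first_half = 2 * math.pi * delta_f * tau
phase_after_pi = -phase_first_half           # pi pulse flips the sign
phase_with_echo = phase_after_pi + 2 * math.pi * delta_f * tau

print(f"Without echo: {phase_no_echo:.2f} rad of unwanted phase")
print(f"With echo:    {phase_with_echo:.2f} rad (static offset cancelled)")
```

The cancellation is exact only if the frequency offset is constant across the interval, which is why echoes suppress slow shifts like these while leaving fast fluctuations untouched.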
