Quantum computer breakthrough tracks qubit fluctuations in real time

Researchers at the Niels Bohr Institute have significantly increased how quickly changes in delicate quantum states can be detected inside a qubit. By combining commercially available hardware with new adaptive measurement techniques, the team can now observe rapid shifts in qubit behavior that were previously impossible to see.

Qubits are the fundamental units of quantum computers, which scientists hope will one day outperform today’s most powerful machines. But qubits are extremely sensitive. The materials used to build them often contain tiny defects that scientists still do not fully understand. These microscopic imperfections can shift position hundreds of times per second. As they move, they alter how quickly a qubit loses energy, and with it valuable quantum information.

Until recently, standard testing methods took up to a minute to measure qubit performance. That was far too slow to capture these rapid fluctuations. Instead, researchers could only determine an average energy loss rate, masking the true and often unstable behavior of the qubit.
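To see why a minute-long average hides this behavior, consider a toy model (all numbers invented for illustration, and simpler than a real telegraph-noise process) in which a qubit's relaxation rate jumps between a "good" and a "bad" value many times per second:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-level fluctuator: the relaxation rate jumps between a
# "good" (0.02 / us) and a "bad" (0.08 / us) value from millisecond to
# millisecond. One sample per ms over 60 s of simulated time.
rates = np.where(rng.random(60_000) < 0.5, 0.02, 0.08)

slow_average = rates.mean()   # what a minute-long measurement reports
fast_samples = rates[:10]     # what a millisecond-scale tracker sees

print(slow_average)   # one number near 0.05, hiding the switching
print(fast_samples)   # resolves the actual jumps between 0.02 and 0.08
```

The slow measurement returns a single intermediate value that neither state of the qubit ever actually has, which is exactly the masking effect described above.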

It is somewhat like asking a strong workhorse to pull a plow while obstacles constantly appear in its path faster than anyone can react. The animal may be capable, but unpredictable disruptions make the job much harder.

FPGA-Powered Real-Time Qubit Control

A research team from the Niels Bohr Institute’s Center for Quantum Devices and the Novo Nordisk Foundation Quantum Computing Programme, led by postdoctoral researcher Dr. Fabrizio Berritta, developed a real-time adaptive measurement system that tracks changes in the qubit energy loss (relaxation) rate as they occur. The project involved collaboration with scientists from the Norwegian University of Science and Technology, Leiden University, and Chalmers University.

The new approach relies on a fast classical controller that updates its estimate of a qubit’s relaxation rate within milliseconds. This matches the natural speed of the fluctuations themselves, rather than lagging seconds or minutes behind as older methods did.

To achieve this, the team used a Field-Programmable Gate Array (FPGA), a reconfigurable classical chip designed for extremely fast, deterministic operations. By running the experiment directly on the FPGA, they could quickly generate a “best guess” of how fast the qubit was losing energy using only a few measurements. This eliminated the need for slower data transfers to a conventional computer.

Programming FPGAs for such specialized tasks can be challenging. Even so, the researchers succeeded in updating the controller’s internal Bayesian model after every single qubit measurement. That allowed the system to continually refine its understanding of the qubit’s condition in real time.
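The team's actual filter runs on the FPGA itself, but the idea of a per-measurement Bayesian update can be sketched in plain Python. In this hypothetical version, the qubit is repeatedly prepared in its excited state and read out after a fixed wait; the probability of still finding it excited decays as exp(-gamma * t), and a gridded posterior over the relaxation rate gamma is renormalized after every single shot. The grid, wait time, and "true" rate are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical grid of candidate relaxation rates (in 1/us) and a flat prior
gamma_grid = np.linspace(0.001, 0.1, 200)
posterior = np.full_like(gamma_grid, 1.0 / gamma_grid.size)

true_gamma = 0.04   # assumed "true" rate used to simulate the qubit
wait_time = 20.0    # us between preparation in |1> and readout

def bayesian_update(posterior, outcome, t):
    """Refine the posterior over gamma after one single-shot measurement.

    outcome == 1 means the qubit was still excited after waiting t:
    P(1 | gamma) = exp(-gamma * t), P(0 | gamma) = 1 - exp(-gamma * t).
    """
    p1 = np.exp(-gamma_grid * t)
    likelihood = p1 if outcome == 1 else 1.0 - p1
    posterior = posterior * likelihood
    return posterior / posterior.sum()   # renormalize after every shot

# Simulate a short burst of single-shot measurements
for _ in range(500):
    outcome = int(rng.random() < np.exp(-true_gamma * wait_time))
    posterior = bayesian_update(posterior, outcome, wait_time)

estimate = gamma_grid[np.argmax(posterior)]
print(f"estimated gamma: {estimate:.3f} / us (simulated true value: {true_gamma})")
```

Because the posterior is updated shot by shot, the estimate tracks the rate with only a few hundred measurements, which is the property that lets the real controller follow millisecond-scale fluctuations.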

As a result, the controller now keeps pace with the qubit’s changing environment. Measurements and adjustments happen on nearly the same timescale as the fluctuations themselves, making the system roughly one hundred times faster than previously demonstrated.

The work also revealed something new: until these experiments, scientists did not know just how quickly such fluctuations occur in superconducting qubits.

Commercial Quantum Hardware Meets Advanced Control

FPGAs have long been used in other scientific and engineering fields. In this case, the researchers used a commercially available FPGA-based controller from Quantum Machines called the OPX1000. The system can be programmed in a language similar to Python, which many physicists already use, making it more accessible to research groups worldwide.

The integration of this controller with advanced quantum hardware was made possible through close collaboration between the Niels Bohr Institute research group led by Associate Professor Morten Kjærgaard and Chalmers University, where the quantum processing unit was designed and fabricated. “The controller enables very tight integration between logic, measurements and feedforward: these components made our experiment possible,” says Morten Kjærgaard.

Why Real-Time Calibration Matters for Quantum Computers

Quantum technologies promise powerful new capabilities, though practical large-scale quantum computers are still under development. Progress often comes incrementally, but occasionally major steps forward occur.

By uncovering these previously hidden dynamics, the findings reshape how scientists think about testing and calibrating superconducting quantum processors. With current materials and manufacturing methods, moving toward real time monitoring and adjustment appears essential for improving reliability. The results also highlight the importance of partnerships between academic research and industry, along with creative uses of available technology.

“Nowadays, in quantum processing units in general, the overall performance is not determined by the best qubits, but by the worst ones: those are the ones we need to focus on. The surprise from our work is that a ‘good’ qubit can turn into a ‘bad’ one in fractions of a second, rather than minutes or hours.

“With our algorithm, the fast control hardware can pinpoint which qubit is ‘good’ or ‘bad’ basically in real time. We can also gather useful statistics on the ‘bad’ qubits in seconds instead of hours or days.

“We still cannot explain a large fraction of the fluctuations we observe. Understanding and controlling the physics behind such fluctuations in qubit properties will be necessary for scaling quantum processors to a useful size,” Fabrizio says.


