Trinity’s quantum physicists in collaboration with IBM Dublin have successfully simulated super diffusion in a system of interacting quantum particles on a quantum computer.

This is the first step in doing highly challenging quantum transport calculations on quantum hardware and, as the hardware improves over time, such work promises to shed new light in condensed matter physics and materials science.

The work is one of the first outputs of the recently established TCD-IBM predoctoral scholarship programme, under which IBM hires PhD students as employees who are co-supervised at Trinity. The paper was recently published in the Nature Portfolio journal *npj Quantum Information*.

IBM is a global leader in the exciting field of quantum computation. The early-stage quantum computer used in this study consists of 27 superconducting qubits (qubits are the building blocks of quantum logic). It is physically located in IBM's lab in Yorktown Heights, New York, and was programmed remotely from Dublin.

Quantum computing is currently one of the most exciting technologies and is expected to edge closer to commercial applications over the next decade. Commercial applications aside, there are fascinating fundamental questions that quantum computers can help answer. The team at Trinity and IBM Dublin tackled one such question concerning quantum simulation.

Explaining the significance of the work, and the idea of quantum simulation in general, Trinity's Professor John Goold, Director of the newly established Trinity Quantum Alliance, who led the research, said:

“Generally speaking, the problem of simulating the dynamics of a complex quantum system with many interacting constituents is a formidable challenge for conventional computers. Consider the 27 qubits on this particular device. In quantum mechanics, the state of such a system is described mathematically by an object called a wave function. In order to use a standard computer to describe this object, you require a huge number of coefficients to be stored in memory, and the demands scale exponentially with the number of qubits; roughly 134 million coefficients in the case of this simulation.

“As you grow the system to, say, 300 qubits, you would need more coefficients than there are atoms in the observable universe to describe such a system, and no classical computer would be able to exactly capture the system’s state. In other words, we hit a wall when simulating quantum systems. The idea of using quantum systems to simulate quantum dynamics goes back to the American Nobel-prize-winning physicist Richard Feynman, who proposed that quantum systems are best simulated using quantum systems. The reason is simple — you naturally exploit the fact that the quantum computer is described by a wave function, thus circumventing the need for exponential classical resources for storage of the state.”
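The exponential scaling Prof. Goold describes is easy to see with a back-of-the-envelope calculation. The short sketch below (an illustration, not part of the published work) counts the complex amplitudes a dense state vector of n qubits requires, assuming 16 bytes per amplitude (double-precision complex):

```python
# A state vector of n qubits has 2**n complex amplitudes.
# Assuming complex128 storage, each amplitude takes 16 bytes.
def state_vector_size(n_qubits: int) -> tuple[int, float]:
    """Return (number of amplitudes, storage in gigabytes)."""
    amplitudes = 2 ** n_qubits
    gigabytes = amplitudes * 16 / 1e9
    return amplitudes, gigabytes

amps, gb = state_vector_size(27)
print(amps)          # 134217728 — the "roughly 134 million" in the quote
print(round(gb, 1))  # 2.1 — about 2 GB just to store one state
```

At 300 qubits the count is 2^300 ≈ 10^90, which indeed exceeds the commonly cited ~10^80 atoms in the observable universe.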

So what exactly did the team simulate? Prof. Goold continues:

“Some of the simplest non-trivial quantum systems are spin chains. These are systems of little connected magnets called spins, which mimic more complex materials and are used to understand magnetism. We were interested in a model called the Heisenberg chain and we were particularly interested in the long-time behaviour of how spin excitations are transported across the system. In this long-time limit, quantum many-body systems enter a hydrodynamic regime and transport is described by equations that describe classical fluids.

“We were interested in a particular regime where something called super-diffusion occurs, because the underlying physics is governed by something called the Kardar-Parisi-Zhang equation. This is an equation which typically describes the stochastic growth of a surface or interface — like how the height of snow grows during a snowstorm, how the stain of a coffee cup on cloth grows with time, or how the front of a fire spreads. The propagation is known to give super-diffusive transport — transport which becomes faster as you increase the system size. It is amazing that the same equations that govern these phenomena crop up in quantum dynamics, and we were able to use the quantum computer to verify that. This was the main achievement of the work.”
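The spin-transport physics in the quotes above can be illustrated on a classical computer for very small chains. The sketch below (a minimal illustration, not the paper's hardware method) builds the Heisenberg Hamiltonian for a tiny spin-1/2 chain with NumPy/SciPy, evolves a single flipped spin in time, and prints the magnetisation profile to show the excitation spreading — it is exactly this kind of brute-force state-vector simulation that becomes impossible as the chain grows:

```python
import numpy as np
from scipy.linalg import expm

# Spin-1/2 operators (Pauli matrices / 2).
sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2

def op(single, site, n):
    """Embed a single-site operator at `site` in an n-site chain."""
    mats = [np.eye(2)] * n
    mats[site] = single
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n = 6  # tiny chain: the state vector already has 2**6 = 64 amplitudes

# Heisenberg chain: H = sum_i (Sx_i Sx_{i+1} + Sy_i Sy_{i+1} + Sz_i Sz_{i+1})
H = sum(op(s, i, n) @ op(s, i + 1, n)
        for i in range(n - 1) for s in (sx, sy, sz))

# Initial state: all spins down except one flipped up in the middle.
psi = np.zeros(2 ** n, dtype=complex)
up_site = n // 2
psi[1 << (n - 1 - up_site)] = 1.0

# Evolve to time t = 2 with the exact propagator U = exp(-iHt).
U = expm(-2j * H)
psi_t = U @ psi

# Magnetisation profile <Sz_i>: the excitation has leaked to neighbours.
profile = [(psi_t.conj() @ op(sz, i, n) @ psi_t).real for i in range(n)]
print(np.round(profile, 3))
```

Measuring how fast such a profile broadens with time distinguishes ordinary diffusion (width ~ t^1/2) from the KPZ super-diffusion (width ~ t^2/3) observed in the paper; on real hardware the evolution is approximated by circuits of two-qubit gates rather than a dense matrix exponential.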

IBM-Trinity predoctoral scholar Nathan Keenan, who programmed the device as part of the project, tells us about some of the challenges of programming quantum computers.

“The biggest problem with programming quantum computers is performing useful calculations in the presence of noise,” he said. “The operations performed at the chip level are imperfect, and the computer is very sensitive to disturbances from its laboratory environment. As a result, you want to minimise the runtime of a useful programme, as this will shorten the time in which these errors and disturbances can occur and affect your result.”

Juan Bernabé-Moreno, Director of IBM Research UK & Ireland, said:

“IBM has a long history of advancing quantum computing technology, not only by bringing decades of research but also by providing the largest and most extensive commercial quantum programme and ecosystem. Our collaboration with Trinity College Dublin, through the MSc for Quantum Science and Technology and PhD programme, exemplifies this and I am delighted that it is already delivering promising results.”

As the world moves into a new era of quantum simulation, it is reassuring to know that Trinity’s quantum physicists are at the forefront — programming the devices of the future. Quantum simulation is a central pillar of research in the newly launched Trinity Quantum Alliance, founded and directed by Prof. John Goold, whose five founding industrial partners are IBM, Microsoft, Algorithmiq, Horizon and Moody’s Analytics.
