New research offers algorithm-based method to identify and control “noise” in quantum computers
While the world awaits the development of superfast quantum computers, scientists have long been grappling with the issue of “noise”. Quantum computing is expected to handle colossal amounts of data at unimaginably fast speeds, so eliminating any loss of data in the process is a prime concern. A recent paper suggests an approach that, if implemented successfully, could be a breakthrough in reducing noise.
“Noise” is inherent to any information system. It can be defined simply as anything that stands in the way of transmitting information with 100% efficiency. When we talk to someone standing some distance away, part of our voice is lost in the air, whether through distance or through other sounds in the surroundings; that loss in transmission, or that interference, is noise. When electricity flows through wires and cables, the conducting material blocks some of the current due to resistance; that is noise too, and scientists relentlessly search for a material that could conduct truly 100% of the current.
To express the concept in the simplest terms: all information systems deal with the storage, retrieval and transmission of data, and any obstacle that degrades the output data relative to the input data is noise. Considered one of the biggest obstacles in quantum computing, noise is a central problem in building functional and practical quantum computers, and it must be addressed before quantum computing technology can be offered as a marketable product.
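The input-versus-output definition above can be made concrete with a toy simulation. This is purely illustrative and not drawn from the paper: a message passes through a channel that flips each bit with some small probability, and the disagreement between input and output is the "noise" in action. All names here (`noisy_channel`, the flip probability of 5%) are illustrative assumptions.

```python
import random

def noisy_channel(bits, flip_prob, rng):
    """Transmit a bit string through a channel that flips each bit
    with probability flip_prob -- a toy model of noise in transit."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

rng = random.Random(42)                       # fixed seed for repeatability
message = [1, 0, 1, 1, 0, 0, 1, 0] * 125      # 1000 input bits
received = noisy_channel(message, flip_prob=0.05, rng=rng)

# Noise = the gap between what went in and what came out.
errors = sum(a != b for a, b in zip(message, received))
print(f"{errors} of {len(message)} bits corrupted in transit")
```

With a 5% flip rate, roughly one bit in twenty arrives corrupted; the output no longer matches the input, which is exactly the sense of "noise" the article describes.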
The paper in focus has been published in the scientific journal Nature Physics. Its authors propose a new algorithmic approach to identifying noise, which could pave the way to better methods of controlling it. That is something the computing world is eagerly looking forward to, because quantum computers would bring a paradigm shift in the way we use technology, cracking problems far beyond the complexity levels that today's supercomputers can handle. But first, noise must be eliminated, or at least reduced to an acceptable extent.
In quantum computing, noise arises as errors introduced while the “qubits” that power quantum computers are being manipulated. The problem grows with system size: the more qubits a system has, the more noise occurs, and as a result the more errors appear in the system. Noise in this context is thus intrinsic to running the system, so no reliable quantum computer can be built unless it is controlled.
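The scaling problem described above can be sketched with a simple back-of-the-envelope calculation. Assuming, for illustration only, that each qubit independently suffers an error with some fixed probability per operation (a simplification; real device noise is more structured), the chance that a larger system comes through error-free shrinks rapidly:

```python
# Illustrative only (not from the paper): if each qubit independently
# errs with probability p per operation, the probability that at least
# one error occurs somewhere in an n-qubit system is 1 - (1 - p)^n.
p = 0.001  # assumed per-qubit error rate, chosen for illustration

for n in (1, 10, 100, 1000):
    p_any_error = 1 - (1 - p) ** n
    print(f"{n:>4} qubits -> P(at least one error) = {p_any_error:.3f}")
```

Even with a per-qubit error rate of just 0.1%, a thousand-qubit system would see at least one error in roughly 63% of operations under this toy model, which is why the article says the problem gets magnified as qubits are added.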
The first step toward eliminating noise is identifying it. In quantum systems, this requires observing and analysing exactly how noise behaves across the system. Until now, such a study could be carried out only on smaller systems with relatively little data. According to media reports, the new research article promises that its proposed algorithms can run such diagnostic analyses on large-scale quantum computing systems as well. What’s more, the method has already been run successfully on Quantum Experience, an IBM platform that gives researchers online access to IBM’s own quantum computing systems.
The new algorithm is reported to identify how many errors are present in a quantum computing system. When it was run on Quantum Experience, the researchers found that it could correctly diagnose the noise in the system and determine how the errors arose in the first place. This is a pioneering achievement in the field of quantum computing. The paper notes that this accomplishment “…opens myriad opportunities for novel diagnostic tools and practical applications…”
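To give a flavour of what "diagnosing" noise means, here is a minimal sketch of the statistical idea in its simplest form. This is emphatically not the paper's algorithm, which is far more sophisticated; it only shows the basic principle of inferring an error rate from repeated measurements. The names `estimate_error_rate` and `noisy_device`, and the hidden 3% error rate, are all hypothetical.

```python
import random

def estimate_error_rate(run_circuit, shots, rng):
    """Prepare a known state many times and infer the error rate from
    how often the measured outcome disagrees with the expected value (0)."""
    failures = sum(run_circuit(rng) != 0 for _ in range(shots))
    return failures / shots

def noisy_device(rng):
    """Hypothetical device: should always return 0, but a fault flips
    the outcome with a hidden probability of 0.03."""
    return 1 if rng.random() < 0.03 else 0

rng = random.Random(7)
rate = estimate_error_rate(noisy_device, shots=10_000, rng=rng)
print(f"estimated error rate: {rate:.3f}")  # should land near the true 0.03
```

Repeating a known experiment and counting disagreements recovers the hidden error rate; the published work goes much further, characterising how errors behave across a large device rather than estimating a single number.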
However, the road ahead is still long. To scale up this new technology, the researchers will have to calibrate quantum computers to eliminate all errors, or noise. At the same time, they will also have to keep correcting these errors as the system runs, so that complex calculations do not go off-track and the process keeps running. But the researchers are confident. As the paper’s lead author, Robin Harper of the University of Sydney, told reporters: “The results are the first implementation of provably rigorous and scalable diagnostic algorithms capable of being run on current quantum devices and beyond.”