# From Quantum Theory to Computing Reality

*New research demonstrates that the quantum computational advantage was not mere theory, and real-life applications are just around the corner.*

In a recently published research article, IBM scientists claim to have successfully proved that quantum computers indeed carry greater advantages than traditional computing. Until now, this was only a theory. But through a real-world experiment – albeit on a small scale – IBM says it has gathered conclusive proof of this superiority.

At the forefront of quantum computing, IBM is one of the few top-end players to have developed functional quantum computers. To date, IBM’s quantum innovations have helped formulate new algorithms, simulate condensed-matter and many-body systems, and open new directions in quantum mechanics and particle physics – expanding the horizons of quantum science. At the world’s largest physics conference – the American Physical Society’s March Meeting in 2021 – 46 of the non-IBM presentations were directly powered by quantum systems developed by IBM! This shows how IBM’s quantum research is fast-tracking scientific discoveries across domains. The recent experiment is just another feather in the tech giant’s cap.

Published in the latest issue of *Nature Physics*, the article clarifies that the experiment – based on limited-space computations – attempted to find out whether current quantum devices, with all their shortcomings, can still perform a task that is beyond the capabilities of a classical computer. The IBM quantum team writes in a blog post:

*“Through our research, we’re exploring a very simple question. How does the computational power differ when a computer has access to classical scratch space versus quantum scratch space? … Here, for the first time that we are aware of, we report a simultaneous proof and experimental verification of a new kind of quantum advantage.”*

Scientists agree that this development is significant because, until now, research had established the quantum advantage only from a theoretical point of view. Practical demonstrations were expected to become possible only once hardware capable of running such large-scale programs had been developed.

**The Quantum Logic**

Traditional computers interpret all data in binary units of zeros and ones. All digital technology follows this basic binary logic, where 0 represents the absence of a signal or a “No”, while 1 represents the presence of a signal or a “Yes”. These two digits – called “bits” – can be arranged in countless combinations to transmit or store any possible information.
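As a small illustration of this binary logic (our own sketch, not tied to any particular system), *n* bits can be arranged in 2ⁿ distinct patterns:

```python
# Illustrative sketch: how many distinct values n classical bits can encode.
def combinations(n_bits: int) -> int:
    """Each bit doubles the number of representable patterns."""
    return 2 ** n_bits

# Three bits yield 2**3 = 8 distinct patterns, 000 through 111.
patterns = [format(v, "03b") for v in range(combinations(3))]
print(patterns)
# ['000', '001', '010', '011', '100', '101', '110', '111']
```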

But quantum logic allows a third state of existence in addition to “yes” or “no”: a superposition, in which an element is suspended between the two states – a state of uncertainty. Physicist Erwin Schrödinger famously captured this idea in his cat thought experiment. Although it sounds alien to our day-to-day thinking, superposition is a well-established feature of quantum theory.

In computing terms, this quantum logic holds that a bit can take a third kind of value in addition to the two absolute values of “yes” or “no”. Such a quantum bit is called a “qubit” – it can stand for 0, for 1, or for a superposition of the two at once. Thus, a single qubit can carry richer information than a binary bit, allowing more data to be processed within the same physical space – and faster too!
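A minimal way to picture a qubit in code (our own sketch, not IBM’s software) is as a pair of complex amplitudes whose squared magnitudes give the probabilities of observing 0 or 1 on measurement:

```python
import math

# Toy model of a qubit: two amplitudes (amp0, amp1) with
# |amp0|^2 + |amp1|^2 = 1. Measuring yields 0 with probability
# |amp0|^2 and 1 with probability |amp1|^2.
def measure_probs(amp0: complex, amp1: complex):
    return abs(amp0) ** 2, abs(amp1) ** 2

# A pure |0> state behaves exactly like a classical 0 bit.
p0, p1 = measure_probs(1.0, 0.0)   # probabilities (1.0, 0.0)

# An equal superposition -- the "third" kind of value -- gives
# 0 and 1 each about half the time when measured.
h = 1 / math.sqrt(2)
q0, q1 = measure_probs(h, h)       # each probability is ~0.5
```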

So far this was all theory, but the recent IBM experiment has empirically demonstrated the advantage.

**The IBM Experiment**

The experiment involved two limited-space circuits – one classical, with only one bit, and the other quantum, with one qubit available for computation and storage. According to the scientists involved, scaling down to a microscopic experiment with limited memory space allowed a fair comparison between the two types of systems.

The problem task for the test involved identifying the majority among three input bits: return zero if more than half of the bits are zero, and one if more than half are one.
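The majority task itself is simple to state in code – this is just a plain restatement of the problem, not the circuitry used in the experiment:

```python
def majority(x1: int, x2: int, x3: int) -> int:
    """Return the majority value among three input bits."""
    return 1 if x1 + x2 + x3 >= 2 else 0

assert majority(0, 1, 0) == 0   # two zeros win
assert majority(1, 0, 1) == 1   # two ones win
```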

The classical system was not expected to complete the task perfectly, as it was armed with only a single bit for both computation and storage. Even when enhanced with random Boolean gates, it could perform with at most an 87.5% success rate. In contrast, the quantum system was expected to succeed 100% of the time because, unlike a classical bit, a qubit can hold a superposition of 0 and 1 – giving it access to a larger space of values.
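The 87.5% ceiling can be made concrete with a brute-force toy model (our own reconstruction, not the paper’s formal proof): enumerate every deterministic strategy of a machine that reads the three input bits one at a time while storing only a single bit in between, and check the best accuracy any of them achieves over all eight inputs. Since a randomized strategy is just an average over deterministic ones, randomness cannot lift this ceiling.

```python
from itertools import product

def majority(bits):
    return 1 if sum(bits) >= 2 else 0

# Each step updates the single stored bit via a Boolean gate
# f(state, input_bit) -> new_state: a truth table over the 4
# possible (state, bit) pairs, so 16 possible gates per step.
GATES = list(product((0, 1), repeat=4))

def apply(gate, state, bit):
    return gate[2 * state + bit]

best = 0.0
for init in (0, 1):                        # initial value of the stored bit
    for g1, g2, g3 in product(GATES, repeat=3):
        correct = 0
        for bits in product((0, 1), repeat=3):
            s = init
            s = apply(g1, s, bits[0])      # read x1
            s = apply(g2, s, bits[1])      # read x2
            s = apply(g3, s, bits[2])      # read x3; final bit is the answer
            correct += (s == majority(bits))
        best = max(best, correct / 8)

print(best)  # 0.875 -- the 87.5% classical ceiling
```

One strategy hitting 7 out of 8: store `x1 AND x2`, then output `state OR x3`; it fails only on input 001. No strategy reaches 8 out of 8, because after two bits the machine must distinguish three required behaviors (always 0, always 1, copy x3) with only two states.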

However, current quantum hardware cannot achieve such ideal computational standards yet, as noise levels are still high. Even so, in the experiment it still beat the classical system with a 93% success rate. The hands-on result suggests that, with the technology evolving by leaps and bounds, this performance gap will soon widen manifold.

As the IBM team sums up in their post: *“We show that qubits, even today’s noisy qubits, offer more value than bits as a medium of storage during computations.”*

**Other Similar Efforts**

Other research teams are also out to prove quantum superiority hands-on. One such team – a UK–France collaboration – has succeeded in setting up a simple quantum-photonics experiment to that end. Their work, published in *Nature Communications*, shows that a quantum computer can verify solutions to problems classified as NP-complete, using a so-called interactive proof protocol and only minimal, unverified information about the solution.

In a parallel development, a Chinese research team announced that it had successfully performed “boson sampling” on a quantum system – a task generally considered beyond classical computing. It is also encouraging that the UK–France effort not only demonstrates quantum superiority but also points to the quantum potential of secure Cloud computing applications.

**Quantum Use Cases**

And that brings us to the quantum use cases. Airbus, Daimler, Google, IBM, Intel, Microsoft, the US Department of Energy – to name only a few – are lining up behind quantum possibilities and putting in time and money. Why are big names going all out on quantum computers? Because it is part of a larger strategy for the future of Big Tech, comprising three big bets – quantum computing, Artificial Intelligence, and the Cloud. Taken together, they form the three essential pillars of an integrated technological future, in which quantum supplies capabilities beyond conventional computing.

Quantum would be a game-changer for Artificial Intelligence and Machine Learning. Quantum speeds would allow algorithms to execute humongous data-driven calculations and leverage an overwhelming number of data dimensions in a fraction of the time.

Quantum would also impact the Cloud, because owning and running physical quantum hardware comes with its own hurdles. Cloud computing would therefore be necessary to drive the quantum revolution. As quantum computers go mainstream, they might be commercially accessible only through the Cloud – cutting out the need for complex on-premises infrastructure.

The unprecedented power of quantum computers makes them useful in many scenarios where classical computers fall far behind. Pioneering breakthroughs are expected in domains as diverse as immersive mixed-reality solutions, chemistry, biomedical research, astrophysics, materials, supply chain and logistics, financial services, and a lot more. Quantum cryptography could change the dynamics of data security, making systems highly resistant to wiretapping or interception. And quantum teleportation could revolutionize communication networks, as it builds on quantum entanglement – in which two quantum particles stay connected, behaving like exact replicas that display correlated states. Any change of state in one is instantly mirrored by the other particle, irrespective of medium or distance.
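The entanglement correlation described above can be sketched in a toy simulation (our own illustration under simplifying assumptions, not real quantum hardware): the Bell state (|00⟩ + |11⟩)/√2 assigns amplitudes to the four joint outcomes, and measuring both particles always yields matching results.

```python
import math
import random

# Toy model of an entangled pair: the Bell state (|00> + |11>)/sqrt(2)
# written as amplitudes over the four joint outcomes.
h = 1 / math.sqrt(2)
bell = {"00": h, "01": 0.0, "10": 0.0, "11": h}

# Joint measurement: each outcome occurs with probability |amplitude|^2,
# so 00 and 11 each occur half the time, and 01/10 never occur.
probs = {k: abs(a) ** 2 for k, a in bell.items()}

rng = random.Random(7)  # seeded for reproducibility
outcomes = rng.choices(list(probs), weights=list(probs.values()), k=1000)

# The two particles always agree, no matter how far apart they are
# when measured.
assert all(o in ("00", "11") for o in outcomes)
```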

The real beauty could be combining the unique capabilities of traditional and quantum computing over the Cloud to derive the best of both worlds together.