Why This Quantum Rush?

It is part of a larger strategic approach for the future of Big Tech – comprising quantum computing, AI, and the Cloud

The world suddenly seems enamoured of the idea of quantum computers. Big names like IBM, Google, Baidu, and Amazon are rushing in search of that Holy Grail of new-age computing. However, quantum computers still exist in a strange twilight world of neither here nor there – and yet everywhere. The real extent of quantum computing's practical applications is still not fully known, but it may safely be assumed that the world has not yet come close to tapping its enormous power.

Take IBM, for example. It is not only gearing up to build a 1,000-qubit quantum computer by 2023; its next stop is a 1-million-qubit quantum system, and work is already in full swing. Just for the sake of perspective, IBM's current operational quantum computers stand at 65 qubits, while the closest competitor – Google – is at 53 qubits. Both of these machines can perform within a few minutes computations that would take a conventional processor ten thousand years to complete. Imagine how massive the capacities of a 1-million-qubit machine could be!

Geordie Rose proposed in 2002 that the number of qubits in a quantum computer doubles every couple of years. Called "Rose's Law", it is the quantum equivalent of Moore's Law, which states that the number of transistors that can fit onto a chip doubles approximately every two years. But simple scaling-up is not the real reason why scientists are rooting for quantum machines.
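The exponential growth that Rose's Law describes is easy to sketch. The snippet below is purely illustrative – the 65-qubit starting point comes from the figures above, but the projection is a naive doubling model, not IBM's actual roadmap.

```python
# Naive sketch of Rose's Law: qubit counts doubling every two years.
# Starting count and doubling period are illustrative assumptions.

def qubits_after(start_qubits: int, years: int, doubling_period: float = 2.0) -> int:
    """Project the qubit count after `years`, doubling every `doubling_period` years."""
    return int(start_qubits * 2 ** (years / doubling_period))

# From a 65-qubit machine, a straight doubling projection:
for years in (2, 4, 6, 8, 10):
    print(years, qubits_after(65, years))
```

Even under this simple model, the count grows more than thirty-fold in a decade – which is exactly why raw scaling alone does not tell the whole story.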

In the strange world of quantum physics, more doesn't always mean better. It's not just about endlessly incrementing the physical number of qubits. Rather, both the number of qubits and how well they perform are crucial factors to consider – because qubits are extremely sensitive and unpredictable in behaviour. An important consideration is how large a circuit can be run on a hardware system before the qubits decohere and the quantum information disappears. This fine balance is what IBM captures in its "quantum volume" – a nuanced metric for overall qubit performance.
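The intuition behind quantum volume can be sketched as follows: the metric is commonly quoted as 2^n, where n is the size of the largest "square" model circuit (n qubits wide, n layers deep) the machine can run reliably before decoherence dominates. The sketch below assumes that framing, and the example device's pass/fail behaviour is invented for illustration:

```python
# Illustrative sketch of the quantum volume idea: QV = 2^n, where n is the
# largest n for which an n-qubit, n-layer model circuit still runs reliably.
# The success test below is a hypothetical stand-in, not real benchmark data.

def quantum_volume(passes_square_circuit) -> int:
    """Return 2^n for the largest n whose n-by-n model circuit succeeds."""
    n = 0
    while passes_square_circuit(n + 1):
        n += 1
    return 2 ** n

# Hypothetical device whose square circuits succeed up to size 6:
device_ok = lambda n: n <= 6
print(quantum_volume(device_ok))  # 64
```

The point of the metric is visible in the sketch: a machine with many qubits but noisy gates may fail small square circuits and score a low quantum volume, while a smaller, cleaner machine can score higher.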

So, if upscaling is not the core target, why is IBM going all out on quantum computers? Because it is part of a larger strategic approach aimed at the future of Big Tech. Quantum computing is just one of IBM's three big bets for the future – the holy trinity of future technologies – comprising quantum computing, Artificial Intelligence, and the Cloud. None of these are standalone bets; rather, they form the three essential pillars of an integrated technological future. The focus is on delivering the future of computation, and quantum is an unavoidable component of that future. It plays a role that was not possible with the bits and bytes of conventional computing.

Quantum would be a game-changer for AI. For both Artificial Intelligence and Machine Learning, quantum speeds open up entirely new possibilities. Quantum versions of machine learning algorithms could carry out enormous data-driven calculations at significantly faster rates, handle a mind-boggling number of data dimensions, and map them into a large quantum feature space in no time. Quantum entanglement could be used to uncover fresh patterns that were impossible to find with classical computing.
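That "large quantum feature space" grows exponentially: the state space of n qubits has 2^n dimensions, so a classical feature vector of the same size quickly becomes infeasible to store explicitly. A quick back-of-the-envelope check:

```python
# The state space of n qubits has 2^n dimensions - the "quantum feature
# space" a quantum ML algorithm could map data into.

def feature_space_dim(n_qubits: int) -> int:
    """Dimension of the state space spanned by n qubits."""
    return 2 ** n_qubits

for n in (10, 30, 50):
    print(n, feature_space_dim(n))
```

At 50 qubits the dimension already exceeds a quadrillion – far beyond what any classical machine could enumerate as an explicit feature vector.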

Quantum would impact the Cloud too. While classical computers evolved from mainframes to minicomputers to personal computers, it is unlikely that quantum computers will experience a similar shift in form factor. Having a physical quantum computer on your desk is not really the aim either – mostly because maintaining quantum states demands extreme cooling and tightly controlled environments. This is where Cloud computing enters the picture. The Cloud gives users access to supercomputer capabilities regardless of where they are physically located – and that is exactly how quantum computing works today. As more powerful quantum computers are developed, they can be accessed commercially through the Cloud, without the requirement of massive housing infrastructure.

The real beauty could lie in combining the individual capacities of traditional and quantum computing over the Cloud to derive the best of both worlds. For example, IBM engineer James Wootton has exploited quantum computing to create random terrain generation within traditional computer games – meaning the game could totally reconfigure itself to an unimaginable degree each time you load it. This is the hybrid Cloud computational model, in which a problem fed into the system is analysed and distributed appropriately – some parts handled by classical computing and others by quantum computing – to come up with the best solution.
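The division of labour in Wootton-style terrain generation can be sketched in a few lines: a quantum source supplies the randomness, and classical code turns it into terrain. In the sketch below, Python's `random.getrandbits` stands in for bits measured from real qubits – the hybrid split of work is the point, not the physics:

```python
# Hybrid sketch: a (simulated) quantum source supplies random bits, and
# classical post-processing turns them into a terrain height map. In a real
# hybrid setup the bits would come from quantum hardware over the Cloud;
# random.getrandbits is a classical stand-in used here for illustration.
import random

def quantum_bits(n: int) -> list:
    """Stand-in for n bits measured from qubits in superposition."""
    return [random.getrandbits(1) for _ in range(n)]

def terrain_row(width: int) -> list:
    """Classical side: accumulate the bitstream into terrain heights."""
    height, row = 0, []
    for bit in quantum_bits(width):
        height += 1 if bit else -1   # random walk over the bitstream
        row.append(height)
    return row

print(terrain_row(16))
```

Swapping `quantum_bits` for a call to real quantum hardware is exactly the kind of distribution the hybrid Cloud model describes: the randomness comes from the quantum side, everything else stays classical.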
