Bits and bytes may soon make way for qubits, and computing will never be the same again
Once upon a time, computers resembled dinosaurs….
Well, almost! Early-generation machines occupied entire rooms, sometimes small buildings: you fed your input in at the front, then walked around the building to collect your output from the rear of the machine, which could be accessed only from the back. The machine itself was an assemblage of gigantic tubes and pipes, kilometres of wiring, and assorted electrical and electronic components worthy of a Star Wars movie set. It huffed and puffed as it processed its inputs, much like an industrial engine. And there were punched cards, punched tapes, magnetic tapes, floppy disks of varying diameters, extremely noisy dot-matrix printers, and unending lengths of odd-sized printer paper, perforated at both margins and folded accordion-style. Computing back then was a gruelling job indeed, despite the actual calculation being performed by the machine! And even the machine could not do much, compared with the humongous processing capacities we now take for granted.
Cut to modern times. As data storage devices shrink in physical size by the day, storage capacity and processing power keep expanding. A mobile phone today commands immensely more computing power than that roomful of a behemoth from the past. Computing itself has undergone unthinkable transformations. Machine calculation is no longer the end but merely a means of performing far more complex data crunching, the kind that enables data-model predictions and artificial intelligence. Crunching ever larger volumes of data into highly refined, immensely more nuanced output is the order of the day. And for that, the faster the processing the better, which explains the insane rush among chip manufacturers to release version after version of processors. But conventional processors are approaching their physical limits, and new ground needs to be broken in computational science to keep up; this is where quantum computing steps in.
The idea behind the technology
The idea of quantum computing has been around for quite some time, though; in fact, ever since classical physics was rocked by the iconoclastic quantum concept in the third decade of the last century. For years it remained just a theory, until experiments such as those at CERN's Large Hadron Collider put some of quantum physics' stranger concepts to the test in the first decade of this millennium. Currently, quantum technology is gaining ground and practical applications are emerging too. As a result, industry-leading IT companies are investing heavily in quantum algorithms.
Computing as we know it handles data in binary units of zeros and ones. There can be infinite combinations of these two digits, technically known as "bits", to convey or store any possible information. Every action in any kind of digital environment follows this basic scheme. In binary coding, 0 stands for the absence of something, say an electric charge, while 1 represents its presence. They are absolute digits, each transmitting either a "Yes" or a "No", arranged in unique permutations to encode a message.
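To make this concrete, here is a small illustrative sketch (not from the article) of how a short message becomes a unique permutation of 0s and 1s and back again, using the standard 8-bits-per-character encoding:

```python
# Toy illustration: each character of a message maps to a unique
# 8-bit pattern of 0s and 1s; the full message is the bits joined up.

def to_bits(message: str) -> str:
    """Encode each character as its 8-bit binary representation."""
    return "".join(format(byte, "08b") for byte in message.encode("ascii"))

def from_bits(bits: str) -> str:
    """Group the stream into bytes of 8 bits and decode them back."""
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return "".join(chr(int(chunk, 2)) for chunk in chunks)

bits = to_bits("Hi")
print(bits)             # 0100100001101001
print(from_bits(bits))  # Hi
```

Every bit here is absolute: at any moment it is definitely a 0 or definitely a 1, never anything in between.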
However, nothing is absolute in the world of quantum physics. It admits a third state, in addition to existence and non-existence, yes and no. Physicist Erwin Schrödinger famously illustrated this with a cat sealed inside a closed box. By normal logic, the animal can be in either of two states inside the box: alive or dead. But which one can be definitively confirmed only when we open the box and look. Until then, inside the closed box, the cat is in a state of uncertainty. This third state is what quantum physics calls a superposition, in which that same cat may be theoretically considered both dead and alive at the same time! It defies practical logic; but, as pure theory, the logic holds.
So, when this strange quantum logic is applied to computing, the 0 and 1 of the binary bits are no longer digits with absolute values. Here, a bit is called a "qubit", denoting a quantum bit, and the same qubit can stand for 0, for 1, or for any superposition of the two at once. This makes it possible for a group of qubits to represent many combinations of binary bits simultaneously. And if several qubits are combined, data processing capability multiplies manifold within the same physical space, and processing speed shoots up.
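A minimal sketch of the mathematics behind this (an illustration, not anything the article's companies actually run): a single qubit is described by two amplitudes whose squared magnitudes give the odds of measuring 0 or 1, and n qubits together need 2 to the power n amplitudes, which is where the extra capacity comes from.

```python
import math

# A single qubit is a pair of complex amplitudes (a, b) with
# |a|^2 + |b|^2 = 1. Measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2.

def measure_probs(a: complex, b: complex):
    """Return the probabilities of reading 0 and 1 from state (a, b)."""
    norm = abs(a) ** 2 + abs(b) ** 2
    return abs(a) ** 2 / norm, abs(b) ** 2 / norm

# An equal superposition: the qubit is "both 0 and 1" until measured.
a = b = 1 / math.sqrt(2)
p0, p1 = measure_probs(a, b)
print(round(p0, 2), round(p1, 2))  # 0.5 0.5

# n qubits together require 2**n amplitudes to describe; this
# exponential growth is the source of quantum computing's power.
for n in (1, 10, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

Simulating just 50 qubits classically already means tracking over a quadrillion amplitudes, which hints at why quantum processors can outrun conventional ones on certain problems.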
Breaking the barrier
Quantum computing is still a developing technology, but it works! When fully functional, computers with quantum processors will break the speed barriers traditional computers are now encountering. Increased computational power will enhance factoring, searching and modelling capabilities, with immense and far-reaching consequences in domains where decisions depend on data models. These include chemistry, pharmacology, economics, genetics, zoology, anthropology, defence sciences, astronomy, meteorology and a host of other fields. To take a very recent example, speeding up data modelling through quantum computing could help identify a potential vaccine candidate within days; traditionally it takes years, even with conventional computers, let alone in the pre-computer era!
Such powerful processing capacity would also work wonders in cracking codes, which might not be a good thing, depending on which side of the fence you are on, of course! Immense processing speeds mean systems encrypted with traditional coding could be decoded in a jiffy.
Hence, security experts are taking pre-emptive measures by developing stronger encryption models well before quantum computing hits the market. Interestingly, they are using the quirks of quantum logic itself to develop quantum encryption. In the new security model, the very act of intercepting an encrypted qubit disturbs its state, so the communicating parties can discern the moment security has been compromised.
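The detection principle can be sketched with a toy simulation (a simplified classical stand-in for quantum key distribution schemes such as BB84, not a real implementation): an eavesdropper who measures a qubit in the wrong basis randomizes it, leaving tell-tale errors for the legitimate parties to spot.

```python
import random

# Toy model of eavesdropping detection in quantum key distribution.
# Bits are encoded in one of two random bases ('+' or 'x'). An
# interceptor guessing the wrong basis scrambles the bit, so sender
# and receiver see errors when they publicly compare a sample.

random.seed(1)

def send_bit(bit: int, basis: str, eavesdrop: bool) -> int:
    """Transmit one bit; an eavesdropper measures in a random basis."""
    if eavesdrop:
        eve_basis = random.choice("+x")
        if eve_basis != basis:        # wrong basis: state is disturbed
            bit = random.randint(0, 1)
    return bit

def error_rate(n: int, eavesdrop: bool) -> float:
    """Fraction of bits that arrive corrupted over n transmissions."""
    errors = 0
    for _ in range(n):
        bit, basis = random.randint(0, 1), random.choice("+x")
        if send_bit(bit, basis, eavesdrop) != bit:
            errors += 1
    return errors / n

print(error_rate(10000, eavesdrop=False))  # 0.0
print(error_rate(10000, eavesdrop=True))   # close to 0.25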
The way ahead
However fantastic these might sound, such developments are actually in progress right now, as you read this. Big players, including IBM and Google, are working on superconducting qubits with nearly zero electrical resistance, and are experimenting to increase the qubit count. IBM has reached 65 qubits against Google's 53, and just to give you an idea: these machines can perform within a few minutes computations that would take a traditional processor ten thousand years to run through. Both companies aim to develop 1,000-qubit computers in the near future, though these are obviously intended for industrial and scientific use rather than personal computing.
Qubits are inherently unstable and need extreme care to hold them at particular energy states. The slightest vibration or variation in surrounding temperature can affect their performance; hence the need to supercool them. Efforts are now on to build quantum computers that can function just as well at normal temperatures. If successful, that will be a breakthrough that reduces both cost and space.