With quantum computing promising mind-boggling machine performance, can our encryption standards match up?
Today, especially amid the vagaries of a post-COVID world, almost all communication and data transmission has moved to the Cloud. The reliance of global network systems on standard encryption methods to protect their users’ sensitive information on the internet cannot be stressed enough. Be it the small padlock in the address bar of your web browser, WhatsApp guaranteeing end-to-end encryption on its calls, or secure websites handling your online purchases – ensuring secure connectivity across platforms is unequivocally crucial.
This holds especially true at a time when the US National Academies of Sciences, Engineering, and Medicine has proclaimed that the development of powerful quantum computing – capable of breaching our conventional cryptographic defences with ease – might not be very far away.
A snapshot of the encryption timeline
Back in the early 1970s, IBM formed a pioneering ‘crypto group’ that developed a first-of-its-kind block cipher to protect customer data. This was dubbed the Data Encryption Standard (DES) and was eventually adopted as the national standard for data protection in the United States. It did, however, have its flaws and was prone to several attacks. In 1997, American network security firm RSA Security even sponsored a competition, offering cash prizes to any group that could break this form of encryption and read the messages being transmitted through it.
Not only was DES cracked, but a custom DES cracker was also built the following year by the Electronic Frontier Foundation (EFF), at a cost of about a quarter of a million dollars. The need to adopt a new encryption standard was palpable.
In the early 2000s, DES was replaced by the Advanced Encryption Standard (AES), a royalty-free standard still used to protect a fair amount of classified US government information. Fast forward to today – a time when livelihoods depend heavily on Cloud computing – and data security rests primarily on encryption methods, both symmetric and asymmetric, that are continually being developed and refined to keep the web safe. Now, however, with the evolution of quantum machines, the challenge is set to become rather steep.
The quantum evolution
Quantum machines, based on principles of quantum mechanics developed in the early twentieth century, are fundamentally different from the classical computers used today. Instead of working with bits that are strictly 0 or 1 (OFF or ON), they use the principles of ‘superposition’ and ‘entanglement’, acting on quantum bits – called qubits – to execute tasks. Superposition, though counterintuitive to everyday logic, allows a qubit to occupy multiple states at once (imagine 0s and 1s coexisting, so that ‘on’ and ‘off’ hold simultaneously).
That is the strangeness of quantum physics, in which an entity may occupy two states at the same moment. ‘Entangled’ qubits, meanwhile, share a single quantum state, so that measuring one predictably determines the state of the other. In extremely simple terms, this has the potential to boost computing speed: something in the ‘on’ state need not wait to reach the ‘off’ state and vice versa, because both coexist – cutting down processing time. This will make the machines of the future far more powerful at several kinds of calculations, including faster attacks on the mathematical underpinnings of modern encryption.
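The idea of superposition can be made concrete with a few lines of code. The sketch below is a classical simulation, not a quantum program: it tracks the two amplitudes of a single qubit, applies a Hadamard gate (the standard operation for creating an equal superposition), and shows that the resulting state is ‘on’ and ‘off’ with equal probability until measured.

```python
import math
import random

# A single qubit's state is a pair of amplitudes for |0> ("off") and |1> ("on").
alpha, beta = 1.0, 0.0          # start in the definite "off" state |0>

# A Hadamard gate puts the qubit into an equal superposition of 0 and 1.
h = 1 / math.sqrt(2)
alpha, beta = h * (alpha + beta), h * (alpha - beta)

# Measurement probabilities are the squared amplitudes.
p0, p1 = alpha ** 2, beta ** 2
print(round(p0, 2), round(p1, 2))   # 0.5 0.5 — equally "on" and "off"

# Measuring collapses the superposition to a single classical outcome.
outcome = 0 if random.random() < p0 else 1
```

Simulating n qubits this way requires tracking 2^n amplitudes, which is exactly why classical machines struggle to imitate quantum ones as qubit counts grow.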
In fact, back in 2016, the US National Institute of Standards and Technology (NIST) launched a six-year competition to develop quantum-proof cryptography standards to protect the machines of the future. Having narrowed down the initial pool of contestants, one approach has emerged as especially promising among the remaining candidates – lattice-based cryptography.
Lattice-based cryptography – the way ahead?
Before we proceed further, let’s imagine a lattice. It is a massive grid spanning thousands of dimensions and made up of billions of individual points. The code is formed by tracing a specific route between two designated points on that complex grid. Hence, to break the code, one would have to know the exact route to follow – or risk getting lost in a perennially convoluted maze with trillions of possible paths.
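The “hidden route through noise” idea can be sketched with a toy version of learning with errors (LWE), the hard problem behind many lattice-based schemes. Everything below is an illustrative assumption – the modulus q = 257, the dimension of 8, and the tiny noise values are orders of magnitude too small for real security – but it shows the mechanism: the public key is a set of equations deliberately corrupted by small errors, and only the secret vector lets you strip the noise away.

```python
import random

random.seed(1)
q = 257           # modulus (toy-sized; real schemes use far larger parameters)
n = 8             # dimension of the secret vector
num_samples = 20  # number of public "noisy" equations

# Secret key: a random vector s in Z_q^n.
s = [random.randrange(q) for _ in range(n)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b)) % q

# Public key: pairs (a_i, b_i) with b_i = <a_i, s> + small error.
# The small random errors are what hide s from an attacker.
public = []
for _ in range(num_samples):
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])
    public.append((a, (dot(a, s) + e) % q))

def encrypt(bit):
    # Sum a random subset of public samples; shift by q//2 to encode a 1.
    subset = random.sample(public, k=6)
    c1 = [sum(a[i] for a, _ in subset) % q for i in range(n)]
    c2 = (sum(b for _, b in subset) + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2):
    # Subtracting <c1, s> leaves bit * q//2 plus a small accumulated error,
    # so we just check which half of the ring the result falls in.
    v = (c2 - dot(c1, s)) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0

assert all(decrypt(*encrypt(b)) == b for b in [0, 1, 1, 0])
```

Decryption works because the accumulated error (at most 6 here) stays well below q/4; an attacker without s faces the maze described above, since recovering s from the noisy equations is believed to be hard even for a quantum computer.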
This is, of course, a far more secure option than the public-key encryption in use today, which relies on traditional mathematics – the difficulty of problems such as factoring enormous numbers – to encode data between two parties, each holding a key: one for the server and one for the user. A machine that can solve those underlying problems quickly could expose data (such as passwords, PINs, etc) at one or both ends.
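For contrast, the public-key approach in use today can be sketched with textbook RSA. This is a simplified, unpadded form with toy primes chosen purely for illustration – real deployments use primes hundreds of digits long plus padding – but it shows the structure: security rests entirely on the difficulty of factoring n back into p and q.

```python
# Textbook RSA with tiny primes — illustration only, trivially insecure at this size.
p, q = 61, 53
n = p * q                  # public modulus (3233); factoring n breaks the scheme
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent via modular inverse (Python 3.8+)

def encrypt(m):
    # Anyone holding the public pair (e, n) can encrypt.
    return pow(m, e, n)

def decrypt(c):
    # Only the holder of the private exponent d can decrypt.
    return pow(c, d, n)

msg = 65
assert decrypt(encrypt(msg)) == msg
```

A large quantum computer could factor n efficiently and recover d from the public key alone – which is precisely the threat that lattice-based alternatives are designed to resist.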
While lattice-based cryptography appears to be a much more robust method of encoding data, it has to remain usable. Lattice schemes tend to need large keys, so a piece of lattice-protected medical equipment with only a small amount of memory might struggle to store and process them – and hence lose its usability in real-life scenarios. The size of the data required for decryption must therefore be kept practical as well.
While quantum computing as a standard feature might still be some way off, it may be closer than we perceive. In fact, in 2019 itself, Google announced that it had built a 53-qubit quantum computer – Sycamore – that solved a complex computational problem in about 200 seconds, a task the company claimed would take a traditional supercomputer almost 10,000 years. Though the claim faced its share of criticism, it showed that the quantum leap is a bridge likely to be crossed in the near future – not a far one.