If you follow IT and cybersecurity news, you’ll be familiar with mentions of quantum computing, usually followed by something about post-quantum cryptography. In fact, just recently, NIST announced the formal approval of the first set of PQC standards, which will doubtless fuel more quantum apocalypse predictions in the news. Let’s take a very high-level look at all this quantum cryptography stuff to see what the fuss is about, what it all means in practice, and who will be affected by PQC migrations.
A very brief intro to cryptography (and breaking it)
Cryptography is the foundation of data privacy, especially on the web. Seeing https:// or a padlock in your address bar is a basic indicator that your connection is secured by encryption, meaning all the data you send and receive is scrambled using a cipher that only you and the intended recipient can reverse. Assuming everything is set up correctly, the only way to get at the original data is to break whatever cipher is being used. And even though they don’t yet exist outside of tiny experimental systems, quantum computers may, in theory, offer a way to break several fundamental modern ciphers.
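To make “scrambled using a cipher” concrete, here is a minimal sketch in Python using the pyca/cryptography package; the key handling and the message itself are purely illustrative:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # the shared secret both sides hold
nonce = os.urandom(12)                     # unique per message; never reuse with a key

aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, b"meet me at noon", None)  # scrambled bytes
plaintext = aesgcm.decrypt(nonce, ciphertext, None)           # requires the same key
assert plaintext == b"meet me at noon"
```

Without the key, the ciphertext is just noise; with it, decryption is trivial. That asymmetry between knowing and not knowing the key is all “breaking a cipher” is about.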
That’s where the big scary stories originate: if somebody could build a working quantum computer, they might (in theory) be able to decrypt any communications sent on the modern web. While nobody has managed to build a practically usable quantum computer, and it’s not completely certain whether that’s even possible, the mere theoretical possibility was enough to start a search for encryption methods that could resist such potential quantum attacks. Why the panic, you may wonder?
Getting quantum on decryption
When you connect to a site or app over HTTPS, your browser (app, phone, car, smart TV, router, you get the picture) and the server at the other end have to securely agree on how they will encrypt their communication and what encryption key to use. After that’s decided, they both have a secret key to encrypt their messages using whatever method they’ve negotiated. This part is called symmetric encryption (because they both use the same key) and is not vulnerable to quantum attacks.
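Here is a toy sketch of that key agreement step, using an X25519 Diffie-Hellman exchange from the pyca/cryptography package. Real TLS wraps this in a full handshake with authentication; this only shows the core idea of two parties arriving at the same key while exchanging only public values (and it is exactly this step that the next section flags as vulnerable):

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates a key pair and sends only the public half over the wire.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()

# Each side combines its own private key with the peer's public key...
client_shared = client_priv.exchange(server_priv.public_key())
server_shared = server_priv.exchange(client_priv.public_key())
assert client_shared == server_shared  # ...and arrives at the same secret

# The raw shared secret is run through a KDF to produce the session key
# that symmetric encryption (like AES-GCM above) will actually use.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"handshake example").derive(client_shared)
```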
The really critical and difficult part—and also the one that’s vulnerable—is securely encrypting and exchanging that key. This is done using public-key (asymmetric) cryptography based on one of several mathematical problems known to be extremely difficult (aka impractically slow) to solve. For existing schemes like RSA or Diffie-Hellman, doing the calculations to find a single key of secure length would take thousands of years using even the most powerful supercomputer. Except these problems are only difficult for a traditional computer—not a quantum one.
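To see why the hardness of the math is the linchpin, here is a deliberately insecure toy RSA in Python with textbook-sized numbers; real keys use primes hundreds of digits long:

```python
from math import gcd

p, q = 61, 53            # secret primes; real RSA uses primes ~300+ digits long
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # derivable only if you know p and q
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)          # encrypt with the public key (e, n)
assert pow(ciphertext, d, n) == message  # decrypt with the private key d

# An attacker who can factor n back into 61 * 53 can redo the computation
# of d and read everything. With a full-size n, factoring takes a classical
# computer impractically long; that gap is the entire security guarantee.
```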
For this tiny specialized subset of problems, a full-scale quantum computer could be orders of magnitude faster than a traditional one and thus potentially provide a way to break the asymmetric part of encrypted communications to grab the secret symmetric key that decrypts your data. The same principle could be used to decrypt stored data gathered in the past or even forge digital signatures, wreaking havoc across the chains of trust that underpin our entire digital world. Even if the risk is still hypothetical, it was clearly a good idea to start thinking ahead for something better.
How a quantum computer could break public-key cryptography
A traditional computer is basically billions of on/off switches doing basic arithmetic really, really fast using ones and zeros. A quantum computer is built from units called qubits; instead of just being on or off, each qubit can exist in a combination of both states at once, and qubits can additionally be linked to one another (entangled) through quantum effects.
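A rough way to picture this is as a vector of weights (amplitudes) over all possible bit patterns. The following toy numpy sketch is not a quantum computer, just the linear algebra that describes one; it shows two qubits being put into superposition and then entangled:

```python
import numpy as np

# Two qubits live in a 4-dimensional state: amplitudes for |00>,|01>,|10>,|11>.
state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips qubit 2 if qubit 1 is 1

state = np.kron(H, I) @ state  # qubit 1 is now a 50/50 combination of 0 and 1
state = CNOT @ state           # and qubit 2 is now linked (entangled) with it

print(np.round(state, 3))      # [0.707 0 0 0.707]: only |00> and |11> remain,
                               # so measuring one qubit fixes the other
```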
Using an approach called Shor’s algorithm, you can program a quantum computer to do certain calculations that can be used to break public-key encryption. Assuming the quantum computer works without errors or noise (and is big enough, and exists in the first place), these calculations would be much faster than on a traditional computer: rather than testing candidates one by one, the interlinked qubits let the machine efficiently find the period of a mathematical function, and knowing that period is enough to recover the private key.
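For the curious, the classical skeleton of Shor’s algorithm is surprisingly short. The sketch below factors 15 the slow way; the period-finding loop in the middle is the one step a quantum computer would accelerate, everything else is ordinary arithmetic:

```python
from math import gcd

N, a = 15, 7           # toy number to factor, and a base coprime to N
assert gcd(a, N) == 1

# Find the period r: the smallest r with a^r = 1 (mod N). A classical machine
# has to search step by step; a quantum computer extracts r efficiently
# using the quantum Fourier transform.
r = 1
while pow(a, r, N) != 1:
    r += 1
# here r = 4

# With an even period, gcd(a^(r/2) +/- 1, N) reveals the factors of N.
f1 = gcd(pow(a, r // 2) - 1, N)
f2 = gcd(pow(a, r // 2) + 1, N)
print(r, f1, f2)  # 4 3 5  ->  15 = 3 * 5
```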
NIST standards for post-quantum cryptography algorithms
Cryptographic practice is guided by the principle that a theoretical weakness today can become a practical break tomorrow. Given what was known about the susceptibility of public-key cryptography to quantum decryption, the National Institute of Standards and Technology (NIST) was given the job of coordinating work on developing and standardizing replacement algorithms that would be resistant to attacks using quantum computers.
After several years and drafts, in August 2024, NIST published the final versions of three major PQC algorithms, each becoming an official Federal Information Processing Standard (FIPS):

- FIPS 203 (ML-KEM, based on CRYSTALS-Kyber): a lattice-based key encapsulation mechanism for establishing shared secret keys
- FIPS 204 (ML-DSA, based on CRYSTALS-Dilithium): a lattice-based digital signature standard
- FIPS 205 (SLH-DSA, based on SPHINCS+): a stateless hash-based digital signature standard, intended as a backup to ML-DSA
A fourth standard, FIPS 206 (FN-DSA, based on the FALCON signature algorithm), is also in the works and should be finalized towards the end of 2024.
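To give a feel for what using one of these standards looks like to a developer, here is a hypothetical sketch based on the Open Quantum Safe project’s liboqs-python bindings; the module name, the "ML-KEM-768" mechanism string, and its availability all depend on the version you have installed, so treat the specifics as assumptions:

```python
# Sketch of a FIPS 203 (ML-KEM) key encapsulation round trip, assuming
# liboqs-python (pip install liboqs-python) with ML-KEM support built in.
import oqs

with oqs.KeyEncapsulation("ML-KEM-768") as receiver:
    public_key = receiver.generate_keypair()  # receiver publishes this

    with oqs.KeyEncapsulation("ML-KEM-768") as sender:
        # Sender encapsulates: produces a ciphertext plus a shared secret.
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext with its private key.
    secret_receiver = receiver.decap_secret(ciphertext)
    assert secret_sender == secret_receiver  # both now hold the same key
```

Note how the shape of the operation mirrors the classical key exchange shown earlier: both sides end up with a shared symmetric key, but the underlying math (structured lattices rather than factoring or discrete logarithms) has no known quantum shortcut.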
What does PQC mean in practice?
The entire web infrastructure was built around public-key cryptography as the foundation of trust and security, so swapping out those algorithms without breaking the internet will be no small undertaking. While nobody is setting a specific date, organizations such as CISA are leading the transition toward PQC, starting with critical infrastructure.
All this will happen under the hood of existing systems, so it should not directly affect end users, but it will mean a lot of work for everyone involved in the transition. The Department of Homeland Security has laid out a roadmap for that transition, and CISA has a dedicated PQC initiative to help guide those efforts. It’s reasonable to expect that other regulatory and industry bodies will follow suit, setting long-term goals to entirely move away from potentially vulnerable public-key algorithms in favor of their quantum-resistant counterparts. Some organizations are already migrating voluntarily as a best practice.
To be clear, PQC migration is a precautionary and future-proofing measure rather than an urgent reaction to any demonstrated existing threat. Cryptographic history has shown time and again that if a theoretical weakness is found in an algorithm or its implementation, there’s a very good chance it will be practically exploited in the future. Add to that the wildcard of secret security agencies worldwide, which could always be years ahead in tools and resources, and the PQC initiative makes a lot of sense as a proactive security measure, especially when it comes to protecting critical infrastructure and national secrets.
For a more detailed discussion of PQC and the practical challenges of migration, see two papers from the UK National Cyber Security Centre (NCSC): Preparing for quantum-safe cryptography and Next steps in preparing for post-quantum cryptography.