Last week, we wondered whether early reports that NIST would soon announce the winners of its post-quantum cryptography competition would come to fruition. Happily, they have. After an evaluation process that began with a call for nominations in 2016 and culled dozens of candidate post-quantum (PQ) cryptographic algorithms down to seven finalists and eight alternates, NIST finally announced which algorithms will advance to standardization. Drumroll, please …

And The Winners Are …

NIST selected one algorithm to standardize for encryption/key establishment and three algorithms for digital signatures. The encryption algorithm is called CRYSTALS-Kyber; the three digital signature algorithms are CRYSTALS-Dilithium, FALCON, and SPHINCS+. CRYSTALS-Kyber, CRYSTALS-Dilithium, and FALCON are all lattice-based cryptosystems, a type of cryptography whose security rests on the difficulty of finding a particular vector (often the shortest nonzero one) in a lattice, a vast grid of points formed by all integer combinations of a set of basis vectors. Lattice-based cryptosystems have been around since the late 1990s, though early versions had to be tweaked to keep up with cryptanalysis.
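
To make the lattice idea concrete, here is a toy (and deliberately insecure) sketch in Python: it builds a tiny two-dimensional lattice from a pair of arbitrary basis vectors and brute-forces the shortest nonzero vector. Real schemes such as CRYSTALS-Kyber work with structured variants of this problem in hundreds of dimensions, where brute force is hopeless; the basis values and search bounds below are illustrative only.

```python
# Toy illustration of the shortest vector problem (SVP) that underpins
# lattice-based cryptography. NOT cryptography -- just a 2D brute force.
import itertools
import math

# Arbitrary basis vectors for a 2D lattice (illustrative values only).
b1 = (201.0, 37.0)
b2 = (1648.0, 297.0)

def lattice_point(x, y):
    """Return the lattice vector x*b1 + y*b2 for integers x, y."""
    return (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])

shortest = None
# Brute-force search over small integer combinations. At the dimensions
# real schemes use (hundreds), this search space is astronomically large.
for x, y in itertools.product(range(-50, 51), repeat=2):
    if x == 0 and y == 0:
        continue  # skip the zero vector
    v = lattice_point(x, y)
    length = math.hypot(*v)
    if shortest is None or length < shortest[1]:
        shortest = (v, length)

print(f"Shortest nonzero vector found: {shortest[0]} (length {shortest[1]:.2f})")
```

The takeaway is scale: this brute-force search grows exponentially with the lattice dimension, and the standardized schemes bet that no known classical or quantum algorithm finds short vectors efficiently at the dimensions they use.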

NIST noted that it selected CRYSTALS-Kyber and CRYSTALS-Dilithium for their security and performance but that FALCON’s smaller signature size was a benefit: smaller signatures are useful for lightweight and low-power devices. SPHINCS+ was the only non-lattice-based cryptosystem selected; NIST noted that it chose SPHINCS+ to “avoid relying on the security of lattices for signatures.”

But We’re Not Done Yet

In their announcement, NIST also selected four algorithms for a fourth round: BIKE, Classic McEliece, HQC, and SIKE. All four are encryption/key establishment algorithms like CRYSTALS-Kyber. During this fourth round, teams can submit updated specifications for those algorithms, which NIST will evaluate. NIST stated that it plans to standardize at least one additional encryption/key establishment algorithm at the end of round four. Each of the four algorithms is a departure from CRYSTALS-Kyber — none rely on lattice-based cryptography — but all have pros and cons. Some are highly performant but need more time for mathematicians to evaluate the security claims. Others are considered highly secure but may not be small or performant enough to address core use cases.

In addition, NIST opened a new call for proposals for PQ digital signature algorithms, particularly those with short signatures and fast verification. In the announcement, NIST noted, “NIST is primarily looking to diversify its signature portfolio, so signature schemes that are not based on structured lattices are of greatest interest.”

Why is NIST requesting additional digital signature schemes if it has already standardized three? Speed and size are issues, and NIST may be actively looking for alternatives that will perform better in situations where memory, processing power, or energy is limited. There is also the lattice dependency. Lattice-based schemes have been deemed the most secure and performant in the evaluation rounds to date, but we at Forrester speculate that NIST may be hedging its bets and looking for alternatives in case advances in cryptanalysis threaten the viability of lattice cryptography, whether through classical cryptanalysis or through a Shor’s-algorithm-like approach that a quantum computer could run.

All of this demonstrates just how impressive the RSA public/private key algorithm was and continues to be: It works for both encryption and signatures, has survived decades of scrutiny, and, until quantum computing, could respond to faster compute simply by increasing key sizes. Even when quantum computers start to approach the power needed to break RSA, don’t be surprised if some PQ laggards opt to double their RSA key sizes to buy a few more years while they migrate to a new PQ algorithm. This isn’t the ideal approach, since it adds another step to an eventual migration, but it’s a stopgap measure.
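
For readers weighing that stopgap, the sketch below shows how small the interim step is, assuming Python’s third-party cryptography package: generate a 4096-bit RSA key (double the common 2048-bit size) and sign with it. Larger moduli demand a larger quantum computer to factor, so this can buy a little time, but it remains a bridge to a PQ algorithm, not a destination.

```python
# Stopgap sketch: generate a larger-than-usual RSA key while planning a
# post-quantum migration. Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# 4096 bits instead of the common 2048 -- more classical security margin,
# but still breakable by a sufficiently large quantum computer (Shor's algorithm).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)

message = b"interim signature while the PQ migration is underway"
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# verify() raises InvalidSignature on failure; silence here means success.
private_key.public_key().verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("4096-bit RSA signature verified (classically).")
```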

Implications For Data Protection

Organizations need to prepare for post-quantum cryptography and make changes. Once quantum computing provides enough processing power to break current encryption algorithms, common practices and purchasing decisions made today will set organizations up for future pain unless they start planning for that future state now. Consider that:

  • Algorithm migration will take years or even decades. Back in 2020, we noted, “NIST deprecated the SHA-1 hashing algorithm in 2011 and security researchers demonstrated attacks against SHA-1 in 2017 — but in 2017, 21% of websites still used SHA-1 certificates, and it wasn’t until 2019 that Microsoft stopped signing Windows updates with SHA-1.” Upgrading hashing algorithms is a cakewalk compared to replacing an entire public key cryptosystem. Cryptography is typically embedded in the bowels of code and infrastructure, and differences in functionality, performance, and storage requirements make these swap-outs particularly challenging. Expect this to take a while, and remember the upstream effects of the software that your firm relies on.
  • Store-now, decrypt-later attacks mean that today’s internet traffic is not secret long term. A sufficiently powerful quantum computer doesn’t exist yet, but savvy attackers are preparing for the day it does by collecting encrypted data (in transit or at rest) protected by public key cryptography. Those data elements remain secret today, but attackers are betting on being able to decrypt that traffic and data once quantum computing advances sufficiently. Once that happens, they will be privy to information such as personal emails, details of legal or corporate negotiations, and bank account numbers, and they could use this information to commit financial fraud, attempt blackmail, or embarrass governments and corporations. Firms that assumed their encrypted data was secure will have renewed cause for concern.
  • Data security practices need a second look. The risk vs. benefit calculus for data retention, migration, and storage practices, as well as purchasing decisions for technologies like data management, enterprise collaboration, and secure communications, will now have to account for crypto-agility and quantum readiness. Common data deletion practices that rely on revoking encryption keys will no longer be sufficient, and firms will need to review data security plans. Note that symmetric encryption algorithms like AES remain safe in a post-quantum world at sufficiently large key sizes (such as AES-256), but symmetric encryption keys protected with public key cryptography risk being exposed, which would in turn expose the encrypted data (see the hybrid encryption sketch after this list).
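
The following sketch, again assuming Python’s cryptography package, shows the hybrid pattern the last bullet describes: bulk data encrypted with AES-256-GCM and the AES data key wrapped with RSA-OAEP. An attacker who harvests both the ciphertext and the wrapped key today only needs to break the RSA layer later to recover the AES key and, with it, the data; that key-wrapping layer is the piece that needs a PQ replacement.

```python
# Hybrid encryption sketch: AES-256-GCM for the data, RSA-OAEP to wrap the
# AES data key. Requires the third-party "cryptography" package.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Symmetric layer: still considered safe post-quantum at 256-bit key sizes.
data_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(data_key).encrypt(nonce, b"details of corporate negotiations", None)

# Asymmetric layer: this is the quantum-vulnerable piece. An attacker who
# stores wrapped_key today can unwrap it once RSA falls to Shor's algorithm
# and then decrypt the AES-GCM ciphertext above.
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
wrapped_key = recipient_key.public_key().encrypt(
    data_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

print(f"ciphertext: {len(ciphertext)} bytes, wrapped AES key: {len(wrapped_key)} bytes")
```

In a PQ migration, only the wrapping step changes: the AES-GCM layer stays, and the RSA-OAEP call is replaced by (or combined with) a PQ key establishment algorithm such as CRYSTALS-Kyber.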

For many people, having multiple standardized PQ algorithms sounds daunting. How do we pick the right one? What if cryptanalysis reveals a flaw down the road? Given the timeline to migrate from RSA and the data protection issues we enumerated above, NIST needed to recommend something now: not recommending anything is more problematic than recommending something that may later be circumvented. Crypto-agility and industrywide collaboration on PQ migration will lift multiple industries and help protect our data long term. Experiment with the various standardized PQ algorithms according to your use case and performance needs, and identify and develop a plan for any potential migration issues.
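
As a starting point for that experimentation, here is one hedged sketch of crypto-agility in Python: hide key establishment behind a small interface so the algorithm becomes a configuration choice rather than something hard-coded throughout the stack. The KeyEncapsulation interface, X25519KEM stand-in, and KEM_REGISTRY names are our own illustrations, not a standard API; once a vetted CRYSTALS-Kyber implementation is available in your environment, it would be registered alongside (or instead of) the classical entry without changes to calling code.

```python
# Crypto-agility sketch: key establishment behind a swappable interface.
# The names (KeyEncapsulation, X25519KEM, KEM_REGISTRY) are illustrative.
# Requires the third-party "cryptography" package.
from typing import Protocol, Tuple

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey,
    X25519PublicKey,
)
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


class KeyEncapsulation(Protocol):
    """Minimal KEM-style seam so the algorithm is a config choice, not code."""

    def generate_keypair(self): ...
    def encapsulate(self, public_key) -> Tuple[bytes, bytes]: ...
    def decapsulate(self, private_key, encapsulated: bytes) -> bytes: ...


class X25519KEM:
    """Classical stand-in (a Diffie-Hellman-based KEM) until a vetted PQ KEM
    such as CRYSTALS-Kyber is adopted in your stack."""

    def generate_keypair(self):
        private_key = X25519PrivateKey.generate()
        return private_key, private_key.public_key()

    def _derive(self, shared: bytes) -> bytes:
        # Turn the raw exchange output into a fixed-length symmetric secret.
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"demo-kem").derive(shared)

    def encapsulate(self, public_key) -> Tuple[bytes, bytes]:
        ephemeral = X25519PrivateKey.generate()
        shared_secret = self._derive(ephemeral.exchange(public_key))
        # The value sent on the wire is the ephemeral public key.
        encapsulated = ephemeral.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw)
        return encapsulated, shared_secret

    def decapsulate(self, private_key, encapsulated: bytes) -> bytes:
        peer = X25519PublicKey.from_public_bytes(encapsulated)
        return self._derive(private_key.exchange(peer))


# Algorithm selection is data, not code: migrating means swapping the entry.
KEM_REGISTRY = {"classical-x25519": X25519KEM()}
kem = KEM_REGISTRY["classical-x25519"]

recipient_private, recipient_public = kem.generate_keypair()
encapsulated, sender_secret = kem.encapsulate(recipient_public)
receiver_secret = kem.decapsulate(recipient_private, encapsulated)
assert sender_secret == receiver_secret
print("Shared secret established through the pluggable KEM interface.")
```

The design choice that matters is the seam: callers depend on encapsulate/decapsulate rather than on any particular algorithm, which is what makes a future swap (or a hybrid classical-plus-PQ mode) a configuration change instead of a rewrite.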