The NIST PQC Standard: What Enterprises Need to Know

Aug. 08, 2024

An FAQ by Vince Berk, Chief Strategist at Quantum Xchange

After eight years of submission review, testing, and evaluation, NIST has formally announced the post-quantum cryptography (PQC) standard specifications, setting in motion the largest cryptographic transition in the history of computing – replacing legacy encryption with PQC algorithms.

Organizations are certain to have lots of questions related to this new standard and the inevitable, multiyear migration that will follow. Quantum Xchange’s Vince Berk addresses the most pressing in the FAQ below.

What are these new algorithms, and why do they matter?

Encryption, at its very core, depends on mathematical problems that are easy to compute in one direction but hard in the other. For example, multiplying two numbers is easy. But if I hand you a big number and ask which numbers were multiplied together to produce it, that is a very different, and much harder, problem to solve.

Encryption today depends on a few of these mathematical problems. In fact, there are very few of them, and it has never been conclusively proven that they are hard to compute in reverse; we only assume so, based on experience. Worse, Peter Shor showed that a quantum computer can break the most common mathematical problems that underlie today’s cryptography. So the new algorithms matter a lot, because they give us more options to secure our communications.
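To make that asymmetry concrete, here is a toy Python sketch (my illustration, not part of the standard) using deliberately small numbers; real RSA moduli are thousands of bits long, which pushes the reverse direction far beyond reach for classical computers, while Shor’s algorithm on a sufficiently large quantum computer would factor them efficiently.

    import time

    # Easy direction: multiplying two primes is instantaneous.
    p, q = 1_000_003, 1_000_033      # toy primes; real keys use primes hundreds of digits long
    n = p * q

    # Hard direction: recovering p and q from n by brute-force trial division.
    def factor(n):
        i = 2
        while i * i <= n:
            if n % i == 0:
                return i, n // i
            i += 1
        return n, 1

    start = time.time()
    print(factor(n))                                   # (1000003, 1000033)
    print(f"factored in {time.time() - start:.3f} s")

Even at this toy scale the factoring step takes visibly longer than the multiplication; at real key sizes the gap becomes astronomical on classical hardware.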

Is the quantum computer really a threat?

Not yet. Quantum computers are still cumbersome and complex, and live mostly in laboratories. The biggest quantum computers are still far too small to solve a complex problem such as breaking encryption. That doesn’t mean quantum computers aren’t getting bigger and bigger every year. They are, and the quantum threat is real.

The determining factor is the number of quantum bits (qubits) available to execute quantum algorithms. Current quantum computers have about 100 of them; to meaningfully break encryption, we probably need a few thousand. But some quantum computing roadmaps already show progress toward far more than that. It is only a matter of time before quantum computers are big enough, and at the current pace of innovation that day may be months or years away.

All that notwithstanding, there is a tangible risk in storing encrypted data today and decrypting it in the future with a quantum computer – an attack vector known as “harvest now, decrypt later.” This applies to files, but also to telecommunications: what is captured today can be stored for future decryption. While expensive and cumbersome, this approach may be well worth it to an adversary who believes you are communicating important secrets. So even today we are seeing the need for new, quantum-safe encryption algorithms because of this harvesting risk.

How will PQC standard algorithms start to appear in the enterprise? What will the roll-out look like?

The roll-out has already begun. The NIST PQC algorithms and the standardization project (launched in 2016) have been public for years; only now has NIST issued an official standard. A handful of web application companies have started to experiment with the new encryption algorithms, and in anticipation of the standard the algorithms have also shipped in a few select web browsers. These modern browsers may very well already be talking “quantum-safe” to certain web servers out on the Internet.

As website and web application providers slowly enable the additional quantum-safe algorithms, many browsers and servers are expected to lag. In fact, most servers will likely continue to accept browsers that are not quantum-safe for years to come, to ensure backward compatibility with older and outdated devices and operating systems. Other web servers – banking sites, for example – will likely force quantum-safe algorithms and demand that you use a modern, supported browser to access your accounts.

Additional protocols in the enterprise will see adoption as applications are updated and upgraded; this will happen largely implicitly, since software upgrades are routine. These protocols may include authentication, e-mail, VPNs, secure shell, databases, and more. Each will require both the server and the client to support the new PQC standards, which means updates on both sides.

What kind of impact can users expect? Are systems compatible? What will break?

This will be a mixed bag, but things are going to break. For many protocols in use today, backward compatibility is unlikely to be the norm, especially where legacy technologies, IoT, and appliances are concerned. We can expect some bumps in the road in terms of adoption.

Take, for instance, a perfectly functional MRI machine in a hospital with a 25-year lifespan. The software on that machine works just fine in every respect, except that it won’t support the new cryptographic algorithms. Updating an older machine is a scary proposition: if it goes offline without an easy fix, the hospital loses productivity. As a result, the adage “if it ain’t broke, don’t fix it” permeates IT departments.

Many other examples exist. In many cases, authentication protocols such as LDAP aren’t even encrypted because they “just work.” Nobody logs a trouble ticket for systems that use weak cryptography. Yet upgrading the server side to stronger cryptography will break clients that were working fine with the older encryption methods.

Exactly what is going to break is hard to predict; it depends on the mix of technologies and applications in use in your environment.

What can enterprises do today to minimize this impact?

NIST has been encouraging enterprises to start with a cryptographic inventory. This is a smart place to start because it not only gives you an idea of where outdated or weak cryptography is used in the enterprise (and thus a starting point for upgrades), it also shows which systems are most likely to break when new encryption algorithms automatically start rolling out. View our NIST co-hosted webinar, PQC Migration: The Discovery Phase.

Surprisingly, most organizations do not have a good handle on the cryptography they use today (see survey results and press release). If anything is managed or monitored at all, it is generally the server side – that is, the certificates on the server are kept modern and strong. However, it is generally the client side that breaks during upgrades, and this is much harder to see with conventional key management approaches.

To ensure that clients, IoT, or legacy devices are not refused service, it is important to get a comprehensive view of cryptography as it is actually used in the enterprise, not how it was designed to be used – a true picture of which clients try to downgrade and which are still using unencrypted communications. Thankfully, most of these conditions can be readily discovered by monitoring the network.
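As a minimal illustration of that discovery idea (a hedged sketch, not Quantum Xchange’s product and not a full inventory tool), the Python snippet below actively probes one placeholder endpoint to see which TLS protocol versions it will still negotiate – the same kind of signal a passive network monitor would surface at scale. The hostname is an assumption; point it only at systems you are authorized to test.

    import socket
    import ssl

    HOST, PORT = "example.com", 443    # placeholder endpoint; substitute your own systems

    VERSIONS = {
        "TLS 1.0": ssl.TLSVersion.TLSv1,
        "TLS 1.1": ssl.TLSVersion.TLSv1_1,
        "TLS 1.2": ssl.TLSVersion.TLSv1_2,
        "TLS 1.3": ssl.TLSVersion.TLSv1_3,
    }

    for name, version in VERSIONS.items():
        # Pin the handshake to exactly one protocol version.
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE        # we only care about protocol support here
        ctx.minimum_version = version
        ctx.maximum_version = version
        try:
            with socket.create_connection((HOST, PORT), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
                    print(f"{HOST} accepts {name} with {tls.cipher()[0]}")
        except (ssl.SSLError, OSError):
            # Note: a local OpenSSL policy that refuses old versions can also land here.
            print(f"{HOST} rejects {name}")

A server that still accepts TLS 1.0 or 1.1 is a reasonable proxy for the kind of legacy client population that will struggle when quantum-safe algorithms are enforced.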

What is the current state of cryptography? What are current cryptography risks?

The state of cryptography in the enterprise is dismal. Cryptography faces many risks, not just the quantum computer. There is no guarantee that the traditional algorithms have not been broken – we just believe it is unlikely. We have seen the slow fall from grace of MD5 and 3DES: once considered sufficient, they no longer are. Even in a conventional computing world, as compute power advances, older cryptographic techniques become unsafe.

This is compounded by the fact that cryptography is implemented in software, and software has bugs – in some cases as many as 20 bugs per 1,000 lines of code. This means that even if you trust the algorithm, you still carry risk in the way it was coded.

Other risks include insider threats, such as the question of who holds the keys installed across the enterprise. Check out our eBook, Single Points of Failure in Cryptography.

Will these new algorithms remediate all cryptographic deficiencies in the enterprise?

Most certainly not. But having additional encryption methods means we might be able to remediate some of cryptography’s single points of failure. For instance, if you encrypt your data with two different algorithms, an attacker must now break both. Similarly, using different algorithms means using multiple software stacks, reducing the risk that both contain a software bug.

Options are helpful because we can create redundancies, much like not storing your family photographs on a single hard drive: by creating an array (RAID), you can suffer one or more drive failures without losing the pictures. The same can be done with encryption to ensure data confidentiality and integrity, even if a single component of the cryptographic stack fails due to a quantum computer, software bug, or data breach.
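As a hedged sketch of that layering idea (an illustration under my own assumptions, not a production design or Quantum Xchange’s implementation), the Python snippet below uses the pyca/cryptography package to wrap data in two independent authenticated ciphers, AES-GCM and ChaCha20-Poly1305, so that both algorithms – and both code paths – would have to fail for the plaintext to be exposed. In practice the two keys would come from separate key-establishment mechanisms (for example, a classical and a post-quantum exchange) rather than being generated side by side.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    def double_encrypt(plaintext: bytes):
        """Encrypt with AES-GCM, then wrap the result in ChaCha20-Poly1305."""
        k1, k2 = AESGCM.generate_key(256), ChaCha20Poly1305.generate_key()
        n1, n2 = os.urandom(12), os.urandom(12)
        inner = AESGCM(k1).encrypt(n1, plaintext, None)
        outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)
        return outer, (k1, k2), (n1, n2)

    def double_decrypt(outer: bytes, keys, nonces) -> bytes:
        """Both layers must verify and decrypt correctly to recover the data."""
        (k1, k2), (n1, n2) = keys, nonces
        inner = ChaCha20Poly1305(k2).decrypt(n2, outer, None)
        return AESGCM(k1).decrypt(n1, inner, None)

    blob, keys, nonces = double_encrypt(b"harvest-proof, we hope")
    assert double_decrypt(blob, keys, nonces) == b"harvest-proof, we hope"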

What will be the longevity of these algorithms? Can we confidently trust them?

Probably a long time, but it is difficult to say. Very smart mathematicians and cryptographers have worked on these algorithms, and thousands of people have tried to defeat them and failed. But while the PQC standard algorithms are good (for now), we cannot mathematically rule out that they will be broken in time (see solutions brief).

This means enterprises will face additional migrations in the future – perhaps 5, 10, or 50 years from now. Because of this unpredictability, it makes a ton of sense for organizations to begin actively controlling and managing their cryptography. Until now, cryptography in the enterprise has mostly been implicit. Putting more controls around cryptography now will make future transitions easier (Full Visibility, Management and Control Over Your Enterprise Encryption).

This strategic approach is called crypto-agility. Being agile with your cryptography means you can switch algorithms more easily when you need to, without breaking things.
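What that looks like in code is open to interpretation; one common pattern, sketched below in Python with hypothetical names and a single placeholder suite, is to route every encrypt/decrypt call through a registry driven by configuration, so swapping in a PQC or hybrid suite later becomes a policy change rather than a hunt through application code.

    import os
    from dataclasses import dataclass
    from typing import Callable, Dict

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    @dataclass
    class CipherSuite:
        name: str
        encrypt: Callable[[bytes, bytes], bytes]   # (key, plaintext) -> blob
        decrypt: Callable[[bytes, bytes], bytes]   # (key, blob) -> plaintext

    def _aes_gcm_encrypt(key: bytes, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)
        return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

    def _aes_gcm_decrypt(key: bytes, blob: bytes) -> bytes:
        return AESGCM(key).decrypt(blob[:12], blob[12:], None)

    # Registry of available suites; a quantum-safe or hybrid suite would be
    # registered here once the libraries you depend on ship one.
    REGISTRY: Dict[str, CipherSuite] = {
        "aes-256-gcm": CipherSuite("aes-256-gcm", _aes_gcm_encrypt, _aes_gcm_decrypt),
    }

    # The algorithm choice lives in configuration, not in application code.
    POLICY = {"preferred_suite": "aes-256-gcm"}

    def protect(key: bytes, data: bytes) -> bytes:
        return REGISTRY[POLICY["preferred_suite"]].encrypt(key, data)

    def unprotect(key: bytes, blob: bytes) -> bytes:
        return REGISTRY[POLICY["preferred_suite"]].decrypt(key, blob)

    key = AESGCM.generate_key(256)
    assert unprotect(key, protect(key, b"rotate me later")) == b"rotate me later"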

How might an enterprise go about becoming truly crypto-agile?

The tools of the crypto-agility trade are only just emerging. As with any control process in the enterprise, it begins with a decision to dedicate time and resources to the problem. In the case of control, this typically means a few things:

  1. A way to monitor what the status quo is
  2. A lever to control the status quo
  3. A set of principles to decide how to make the changes; and
  4. A target state

Most of these parts and pieces can be put in place today. The inevitable migration to the new NIST PQC standard is a great opportunity to rethink your cryptography and security strategy moving forward, collectively making our data, systems, devices, and communications stronger and more secure for all.
