Defending Encryption: Quantum XChange at Cisco Live Amsterdam
01 Apr 2026
Google just moved up the clock.
On March 25, 2026, Google announced that organizations should complete their post-quantum cryptography migration by 2029, two years ahead of the NSA’s 2031 target and six years ahead of the broader U.S. government benchmark of 2035. The reason: quantum computing hardware is advancing faster than most predicted, error correction is improving, and the math required to crack today’s encryption keeps getting easier for quantum systems to handle.
If you’re planning or already deploying quantum-readiness measures, this changes your timeline. More importantly, it should change how you think about your approach.
As Quantum XChange CEO Eddy Zervigon wrote this week: “The organizations that win in this transition won’t be the ones who pick the ‘right’ algorithm. They’ll be the ones who deliberately design for change, for true crypto-agility, for defensible diversification.”
That’s the frame for everything that follows.
Google’s researchers found that cracking RSA-2048 encryption may now require roughly one million noisy qubits, down from the 20 million previously estimated. That’s not a marginal improvement. It compresses the viable timeline for a Cryptographically Relevant Quantum Computer (CRQC) in a meaningful way.
But here’s what matters most for your security posture today: the threat is already active. Adversaries are running harvest-now, decrypt-later (HNDL) campaigns right now. They’re capturing your encrypted data in transit, storing it, and waiting. Whether Q-Day arrives in 2029 or 2032, data moving across your network today is already at risk.
Bain & Company research from early 2026 found that 90% of organizations don’t yet have systems in place to defend against quantum threats, despite 71% expecting quantum-enabled attacks within five years. Only one in ten has a formal roadmap.
That gap is the problem.
NIST finalized its first set of PQC standards in 2024. Deploying those algorithms is necessary. But it’s not sufficient, and treating it as the finish line is a mistake.
Here’s why.
Algorithms protect data where they’re implemented. But most organizations have vast amounts of data moving across networks every day through infrastructure that wasn’t designed with quantum threats in mind. Legacy systems, cloud connections, third-party integrations, and mixed vendor environments all create gaps that algorithm deployment at the application or OS layer can’t address.
Google itself acknowledged this, noting that it began preparing for PQC as early as 2016 and that the migration complexity reflects “real-world deployment, not theoretical readiness.” A decade of preparation from one of the most technically capable organizations on the planet. Most enterprises don’t have that runway.
The harder question isn’t which algorithm to pick. It’s what happens when that algorithm breaks. Because it will. NIST has already signaled that the current approved algorithms will evolve. Cryptanalysis doesn’t stop.
Zervigon put it plainly: “Algorithms will evolve because some will fail and others will replace them. That cycle won’t stop in the quantum era; it will accelerate. If your security model can’t adapt to that reality, it won’t survive it.”
If your quantum security strategy is built around a single algorithm or a fixed implementation, you don’t have a strategy. You have a single point of failure.
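Crypto-agility is ultimately an architectural property: the choice of algorithm lives behind a single seam, so retiring a broken scheme is a configuration change rather than an application rewrite. A minimal Python sketch of that pattern follows. The HMAC constructions here are stand-ins purely for illustration, not real signature schemes, and the registry names are hypothetical, not any vendor's or standard's API:

```python
import hmac
import hashlib
from typing import Callable, NamedTuple

# Illustrative sketch: keep the algorithm choice behind one seam so it can
# be swapped without touching call sites. HMAC-SHA256 and HMAC-SHA3-256
# stand in for a classical scheme and a PQC scheme (e.g., ML-DSA).

class SignatureScheme(NamedTuple):
    sign: Callable[[bytes, bytes], bytes]          # (key, message) -> tag
    verify: Callable[[bytes, bytes, bytes], bool]  # (key, message, tag) -> ok

def _hmac_scheme(digest) -> SignatureScheme:
    return SignatureScheme(
        sign=lambda key, msg: hmac.new(key, msg, digest).digest(),
        verify=lambda key, msg, tag: hmac.compare_digest(
            hmac.new(key, msg, digest).digest(), tag),
    )

# The registry is the single point of change: replacing a failed algorithm
# is an entry swap plus re-keying, not a rewrite of every caller.
REGISTRY = {
    "classical-stand-in": _hmac_scheme(hashlib.sha256),
    "pqc-stand-in": _hmac_scheme(hashlib.sha3_256),
}

ACTIVE = "pqc-stand-in"  # one config knob selects the live algorithm

def sign(key: bytes, msg: bytes) -> bytes:
    return REGISTRY[ACTIVE].sign(key, msg)

def verify(key: bytes, msg: bytes, tag: bytes) -> bool:
    return REGISTRY[ACTIVE].verify(key, msg, tag)
```

Callers only ever invoke `sign` and `verify`; when cryptanalysis forces a change, flipping `ACTIVE` (and re-keying) is the whole migration at the application layer.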
The network is the largest attack surface in any organization. It’s where data moves between systems, clouds, facilities, and users. It’s also where HNDL attacks do their work. Adversaries don’t need to breach your endpoints. They capture data in transit and wait.
This is where your post-quantum investment delivers the most return: protecting data in motion at the network layer.
OS-level PQC updates, like Google’s integration of ML-DSA into Android 17, are useful and necessary for those ecosystems. But they don’t protect data as it traverses your broader infrastructure. Every hop your data takes across a network that isn’t hardened is an opportunity for collection.
Network-layer protection addresses the broadest exposure with the fewest implementation touch points. It’s the highest-leverage place to start.
One of the biggest barriers to moving on quantum readiness is the assumption that migration means years of infrastructure replacement, downtime, and disruption. That’s not accurate, and it’s keeping organizations from acting.
The right architecture overlays existing infrastructure. It hardens what you already have without requiring you to rip out and replace systems. It works across mixed vendor environments, supports crypto-agility so you can swap algorithms when needed, and does it without taking networks offline.
That means quantum readiness doesn’t have to be a multi-year capital project. It can be an immediate operational step.
The organizations that will be in the best position when Q-Day arrives aren’t the ones who started last. They’re the ones who built for adaptability, not perfection. They can update algorithms quickly, they have visibility into what’s protecting what, and they’re not dependent on any single cryptographic standard holding up indefinitely.
Google’s 2029 deadline isn’t a reason to panic. It’s a reason to act with urgency and clarity.
Your data in motion is your most exposed asset. Protecting it at the network layer, without disrupting your operations, is where you get the most from your post-quantum investment right now. Build for crypto-agility from day one so algorithm changes are routine, not emergencies.
Have one of our experts show you how Phio TX protects your organization from threats today and the quantum future.
Request a demo