Back in 1999, everybody caught the “Y2K” bug. According to Y2K’s “prophecy of doom”, the transition into a new millennium would wreak havoc on computer networks globally and ultimately bring our entire civilization to a grinding halt.
Y2K turned out to be a damp squib.
But it’s highly unlikely that Y2Q or “years to quantum” will be as easy to tackle. Y2Q is approaching – and fast!
Experts like Michele Mosca, Deputy Director at the University of Waterloo’s Institute for Quantum Computing, put the odds of reaching Y2Q at 1 in 7 by 2026 and 1 in 2 by 2031. When Y2Q becomes a reality, quantum computers will easily break the outdated cryptographic protocols we currently rely on to protect our systems and data. Assets worth US$3.5 trillion are at risk because they still rely on outdated cryptography!
Currently, the best way to ward off future quantum attacks is to develop stronger quantum-resistant encryption. And of the many approaches that are being developed today, post-quantum cryptography (PQC) appears to be the most promising.
But even though PQC has garnered support from governments thanks to its cost-effectiveness, many PQC methods work well only in the lab. In unpredictable real-world environments, they don’t always hold up. They can also be difficult to deploy (though certainly not as difficult as QKD).
Here are the three main drawbacks of PQC-based systems you need to be aware of.
Large Key Sizes & Performance Costs
Most PQC systems require much larger keys than classical public-key algorithms. While the larger key sizes contribute to PQC’s security, they carry serious performance implications for quantum-resistant cryptography.
Compared to legacy public-key cryptosystems, PQC algorithms can take more time to encrypt and decrypt messages. Not only that, but the larger keys occupy more storage space, need more memory, and require more network bandwidth.
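To put rough numbers on that gap, here’s a minimal sketch comparing the smallest NIST-standardized PQC parameter sets against today’s most widely deployed elliptic-curve algorithms. The byte counts are taken from the published specifications (RFC 7748 and RFC 8032 for the classical algorithms, NIST FIPS 203 and FIPS 204 for the post-quantum ones); the script itself just prints them side by side.

```python
# Size comparison (bytes): widely deployed classical algorithms vs. the
# smallest NIST-standardized PQC parameter sets. Byte counts are taken
# from the published specs (RFC 7748/8032, FIPS 203/204).
SIZES = {
    # algorithm:                (public key, ciphertext or signature)
    "X25519 key exchange":      (32, 32),      # peer's share plays the ciphertext role
    "ML-KEM-512 key exchange":  (800, 768),
    "Ed25519 signature":        (32, 64),
    "ML-DSA-44 signature":      (1312, 2420),
}

for name, (pk, payload) in SIZES.items():
    print(f"{name:26} public key: {pk:5d} B   payload: {payload:5d} B")
```

Even at the lowest standardized security level, the post-quantum keys and signatures are one to two orders of magnitude larger than their classical counterparts.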
At smaller scales and with small amounts of data, you might not even notice the performance cost of quantum-resistant cryptography. However, once you start transmitting and handling thousands of different keys simultaneously, the performance impact of PQC will begin to add up.
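If you want to gauge that cumulative overhead on your own hardware before migrating, a micro-benchmark along these lines is a reasonable starting point. This sketch assumes the open-source liboqs-python bindings (imported as `oqs`) are installed and that your liboqs build includes ML-KEM-768; the loop count is an arbitrary stand-in for “thousands of handshakes”.

```python
# Micro-benchmark sketch: time many key-encapsulation round trips in a row,
# roughly simulating a server handling a burst of PQC handshakes.
# Assumes the Open Quantum Safe liboqs-python bindings are installed.
import time
import oqs

ALG = "ML-KEM-768"  # assumed to be enabled in your liboqs build
N = 1000            # arbitrary number of simulated handshakes

with oqs.KeyEncapsulation(ALG) as server, oqs.KeyEncapsulation(ALG) as client:
    public_key = server.generate_keypair()
    start = time.perf_counter()
    for _ in range(N):
        ciphertext, shared_secret = client.encap_secret(public_key)
        assert server.decap_secret(ciphertext) == shared_secret
    elapsed = time.perf_counter() - start

print(f"{ALG}: {N} encapsulate/decapsulate round trips in {elapsed:.2f} s "
      f"({N / elapsed:.0f} per second)")
```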
Older infrastructure with outdated hardware may be unable to keep up with the performance requirements of PQC. Worse, PQC can add latency to time-sensitive workloads, such as computer vision systems in autonomous vehicles. Resource-constrained devices such as smartphones and IoT sensors may also struggle to run PQC.
The bottom line is that you may need to upgrade your infrastructure to support your transition to PQC. So even though PQC can be delivered to any device through software, the potential need to upgrade your infrastructure might make PQC’s deployment very challenging.
There is no way around this – if you want to protect yourself from quantum threats, you’ll need to accept some costs. That being said, some PQC algorithms are more efficient than others – you’ll want to use them to protect your infrastructure.
Non-Ideal Scalability
Many PQC algorithms struggle to maintain their hardness (resistance to attacks) at scale. For example, lattice-based cryptography, one of the most promising PQC techniques under research, scales well but only achieves average-case hardness: its security guarantees hold for typical, randomly generated problem instances rather than for every possible instance.
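To make “problem instance” concrete, here is a toy Learning With Errors (LWE) instance – the kind of randomly generated instance whose average-case hardness lattice-based schemes depend on. The dimensions are purely illustrative and far too small to offer any security.

```python
import numpy as np

# Toy LWE instance (illustrative only -- real schemes use dimensions in
# the hundreds and carefully chosen error distributions).
rng = np.random.default_rng(0)
n, m, q = 8, 16, 97                       # secret length, samples, modulus

s = rng.integers(0, q, size=n)            # secret vector (the attacker's target)
A = rng.integers(0, q, size=(m, n))       # public random matrix
e = rng.integers(-2, 3, size=m)           # small noise added to each sample
b = (A @ s + e) % q                       # public vector

# The LWE problem: given the random pair (A, b), recover s. Without the
# noise e this is ordinary linear algebra; with it, the best known attacks
# (classical or quantum) scale badly as n grows. Each random (A, b) is one
# "average-case" instance -- the kind the security proofs cover.
print("public instance:", A.shape, b.shape)
```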
It appears that scalability and encryption hardness are competing qualities. We can excel at one or the other, but not both. That being said, this might be true only for the PQC systems that are currently in development. In the future, researchers and cybersecurity vendors might be able to come up with solutions that can maintain their hardness at any scale.
Vulnerability to Advancements in Quantum Tech
Unlike quantum cryptography – and, in particular, quantum key distribution (QKD) – quantum-resistant cryptography will remain sensitive to increases in quantum computing power. QKD is grounded in quantum mechanics and is theoretically impervious to attacks from quantum computers, no matter how powerful they become. QKD has many practical issues that limit its real-world security, but in theory, at least, it is future-proof.
Now, the vulnerability of quantum-resistant cryptography to advancements in quantum technology is a very long-term problem – we probably won’t have to worry about it for quite some time. Still, this is something that we need to keep in mind going forward.
As quantum computers become more powerful, early PQC algorithms may need to be upgraded or replaced entirely. We will probably be able to offset some of that growing power by moving to longer keys and larger parameter sets. But eventually, PQC might prove defenseless against sufficiently advanced quantum computers.
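As a concrete illustration of that lever, here are the three ML-KEM parameter sets standardized in NIST FIPS 203. Stepping up from the lowest to the highest security category exactly doubles the bytes exchanged per handshake – more margin against future machines, paid for in bandwidth.

```python
# Public-key and ciphertext sizes (bytes) for the three ML-KEM parameter
# sets in NIST FIPS 203. Raising the security category is the main lever
# for offsetting more powerful attackers -- at a bandwidth cost.
ML_KEM = {
    # name:        (NIST category, public key, ciphertext)
    "ML-KEM-512":  (1, 800, 768),
    "ML-KEM-768":  (3, 1184, 1088),
    "ML-KEM-1024": (5, 1568, 1568),
}

for name, (cat, pk, ct) in ML_KEM.items():
    print(f"{name}: category {cat}, {pk + ct} bytes per key establishment")
```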
There’s also the possibility that researchers will discover a quantum algorithm that can efficiently solve the math underlying PQC. Just as Shor’s algorithm shattered our assumptions about classical public-key cryptography, a future breakthrough could defeat PQC just as decisively.
Prepare For Quantum With QiSpace™
At Quantropi, we believe that proactiveness and agility are the keys to protecting ourselves from the attacks of both today and tomorrow.
As the only quantum security provider in the world offering the three prerequisites for cryptographic integrity – Trust, Uncertainty, and Entropy (TrUE) – Quantropi is a force to be reckoned with in quantum cybersecurity. Our patented TrUE technologies – MASQ™ asymmetric encryption for Trust, QEEP™ symmetric encryption for Uncertainty, and SEQUR™ ultra-random key generation and distribution for Entropy – provide end-to-end protection across entire enterprise infrastructures, from on-premises data centers to communications between employees.
A highlight of TrUE is MASQ™ – Quantropi’s novel PQC algorithm that uses much smaller keys than NIST finalists. MASQ™ maintains performance similar to classical algorithms while offering the same or better quantum-safe protection.
All of Quantropi’s TrUE technologies are accessible via our flagship QiSpace™ platform. Talk to us today to learn more about QiSpace™ and how it can help you prepare for the future!