The evolution of computing has always been driven by significant technological advancements, and the latest mark a giant leap into the quantum computing era. Early computers, like the ENIAC, were large and relied on vacuum tubes for basic calculations. The invention of transistors and integrated circuits in the mid-20th century led to smaller, more efficient machines, and the development of microprocessors in the 1970s enabled the creation of personal computers, making technology accessible to the public.
Over the decades, continuous innovation has exponentially increased computing power. Quantum computers, now in their infancy, use the principles of quantum mechanics to address complex problems beyond classical computers' capabilities, marking a dramatic leap in computational power and innovation.
Quantum Computing Basics and Impact
Quantum computing originated in the early 1980s, introduced by Richard Feynman, who suggested that quantum systems could be more efficiently simulated by quantum computers than classical ones. David Deutsch later formalized this idea, proposing a theoretical model for quantum computers.
Quantum computing leverages quantum mechanics to process information differently than classical computing. It uses qubits, which can exist in a state of 0, 1, or both simultaneously. This capability, known as superposition, allows for parallel processing of vast amounts of information. Additionally, entanglement enables qubits to be interconnected, enhancing processing power and communication, even across distances. Quantum interference is used to manipulate qubit states, allowing quantum algorithms to solve problems more efficiently than classical computers. This capability has the potential to transform fields like cryptography, optimization, drug discovery, and AI by solving problems beyond classical computers' reach.
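The superposition idea above can be sketched with a toy state-vector model (illustrative only, not a real quantum SDK): a qubit is a two-element vector of amplitudes, and the Hadamard gate turns the definite state |0⟩ into an equal superposition of |0⟩ and |1⟩.

```python
# Toy single-qubit model: a state is a 2-element amplitude vector.
import math

ket0 = [1.0, 0.0]  # the definite state |0>

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state vector."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

superposed = hadamard(ket0)
# Born rule: measurement probability is the squared amplitude.
probs = [round(amp ** 2, 10) for amp in superposed]
print(probs)  # [0.5, 0.5] -> equal chance of measuring 0 or 1
```

Measuring the superposed qubit collapses it to 0 or 1 with equal probability, which is the behavior quantum algorithms exploit and interfere with.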
Security and Cryptography Evolution
Threats to security and privacy have evolved alongside technological advancements. Initially, threats were simpler, such as physical theft or basic codebreaking. As technology advanced, so did the sophistication of threats, including cyberattacks, data breaches, and identity theft. To combat these, robust security measures were developed, including advanced cybersecurity protocols and cryptographic algorithms.
Cryptography is the science of securing communication and information by encrypting it into codes that require a secret key for decryption. Classical cryptographic algorithms fall into two main types: symmetric and asymmetric. Symmetric cryptography, exemplified by AES, uses the same key for both encryption and decryption, making it efficient for large data volumes. Asymmetric cryptography, including RSA and ECC for authentication, involves a public-private key pair, with ECC offering efficiency through smaller keys. Additionally, hash functions like SHA ensure data integrity, and key-exchange methods like Diffie-Hellman enable secure key sharing over public channels. Cryptography is essential for securing internet communications, protecting databases, enabling digital signatures, and securing cryptocurrency transactions, playing a vital role in safeguarding sensitive information in the digital world.
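The integrity role of hash functions mentioned above is easy to demonstrate with Python's standard library: any change to a message, however small, produces a completely different SHA-256 digest, so tampering is detectable.

```python
# Hash functions for data integrity: a one-character change to the
# message yields an entirely different SHA-256 digest.
import hashlib

message = b"wire transfer: $100 to account 12345"
digest = hashlib.sha256(message).hexdigest()

tampered = b"wire transfer: $900 to account 12345"
tampered_digest = hashlib.sha256(tampered).hexdigest()

print(digest != tampered_digest)  # True: the tampering is detected
print(len(digest))                # 64 hex chars = 256-bit digest
```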
Public key cryptography is founded on mathematical problems that are easy to perform but difficult to reverse, such as multiplying large primes. RSA uses prime factorization, and Diffie-Hellman relies on the discrete logarithm problem. These problems form the security basis for these cryptographic systems because they are computationally challenging to solve quickly with classical computers.
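The "easy forward, hard reverse" asymmetry can be seen with a toy example: multiplying two primes is a single operation, while recovering them from the product requires search. (Real RSA moduli are 2048+ bits; the primes below are tiny, chosen only for demonstration.)

```python
# Trapdoor asymmetry behind RSA: multiplication is instant,
# factoring requires search (trial division here, for illustration).
p, q = 104_723, 104_729          # two small primes
n = p * q                        # easy direction: one multiplication

def naive_factor(n):
    """Recover the factors by trial division -- infeasible at real key sizes."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(naive_factor(n))           # (104723, 104729), after ~100,000 divisions
```

At 2048-bit sizes, this reverse direction is believed infeasible for classical computers, which is precisely the assumption quantum algorithms undermine.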
Quantum Threats
The most concerning aspect of the transition to a quantum computing era is the potential threat it poses to current cryptographic systems.
Quantum algorithms such as Shor's can efficiently solve the prime factorization and discrete logarithm problems that underpin RSA, ECC, and Diffie-Hellman, so a sufficiently powerful quantum computer could break today's public key encryption outright. Such breaches would have catastrophic outcomes, exposing sensitive information and compromising cybersecurity globally. The challenge lies in developing and implementing quantum-resistant cryptographic algorithms, known as post-quantum cryptography (PQC), to protect against these threats before quantum computers become sufficiently powerful. Ensuring a timely and effective transition to PQC is critical to maintaining the integrity and confidentiality of digital systems.
Comparison – PQC, QC and CC
Post-quantum cryptography (PQC) and quantum cryptography (QC) are distinct concepts.
The table below illustrates the key differences and roles of PQC, quantum cryptography, and classical cryptography, highlighting their objectives, techniques, and operational contexts.
Feature | Post-Quantum Cryptography (PQC) | Quantum Cryptography (QC) | Classical Cryptography (CC) |
---|---|---|---|
Objective | Secure against quantum computer attacks | Use quantum mechanics for cryptographic tasks | Secure using mathematically hard problems |
Operation | Runs on classical computers | Involves quantum computers or communication methods | Runs on classical computers |
Techniques | Lattice-based, hash-based, code-based, etc. | Quantum Key Distribution (QKD), quantum protocols | RSA, ECC, AES, DES, etc. |
Purpose | Future-proof existing cryptography | Leverage quantum mechanics for enhanced security | Secure data based on current computational limits |
Focus | Protect current systems from future quantum threats | Achieve new levels of security using quantum principles | Provide secure communication and data protection |
Implementation | Integrates with existing communication protocols | Requires quantum technologies for implementation | Widely implemented in existing systems and networks |
Insights into Post-Quantum Cryptography (PQC)
The National Institute of Standards and Technology (NIST) is currently reviewing a variety of quantum-resistant algorithms:
Cryptographic Type | Key Algorithms | Basis of Security | Strengths | Challenges |
---|---|---|---|---|
Lattice-Based | CRYSTALS-Kyber, CRYSTALS-Dilithium | Learning With Errors (LWE), Shortest Vector Problem (SVP) | Efficient, flexible; strong candidates for standardization | Complexity in understanding and implementation |
Code-Based | Classic McEliece | Decoding linear codes | Robust security, decades of analysis | Large key sizes |
Hash-Based | XMSS, SPHINCS+ | Hash functions | Straightforward, reliable | Requires careful key management |
Multivariate Polynomial | Rainbow | Systems of multivariate polynomial equations | Fast signing and verification | Large key sizes; broken by a 2022 key-recovery attack |
Isogeny-Based | SIKE (Supersingular Isogeny Key Encapsulation) | Finding isogenies between elliptic curves | Compact key sizes | Broken by a 2022 key-recovery attack and withdrawn |
As summarized above, quantum-resistant cryptography encompasses various approaches. Each offers unique strengths, such as efficiency and robustness, but also faces challenges like large key sizes or computational demands. NIST's Post-Quantum Cryptography Standardization Project is working to rigorously evaluate and standardize these algorithms, ensuring they are secure, efficient, and interoperable.
Quantum-Ready Hybrid Cryptography
Hybrid cryptography combines classical algorithms like X25519 (an ECC-based algorithm) with post-quantum algorithms, an approach often referred to as "Hybrid Key Exchange," to provide a dual layer of security against both current and future threats. Even if one component is compromised, the other remains secure, ensuring the integrity of communication.
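The dual-layer property comes from deriving the session key from both shared secrets at once. The sketch below shows the combining idea only (the byte strings stand in for real X25519 and ML-KEM outputs, and TLS 1.3 feeds the concatenated secrets into its own key schedule rather than a bare SHA-256): an attacker must recover both inputs to learn the key.

```python
# Minimal sketch of a hybrid secret combiner (illustrative only).
import hashlib
import os

classical_secret = os.urandom(32)  # stand-in for an X25519 shared secret
pq_secret = os.urandom(32)         # stand-in for an ML-KEM shared secret

def combine(classical: bytes, post_quantum: bytes) -> bytes:
    """Derive one session key from BOTH secrets: breaking only one
    key exchange reveals nothing about the derived key."""
    return hashlib.sha256(classical + post_quantum).digest()

session_key = combine(classical_secret, pq_secret)
print(len(session_key))  # 32
```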
In May 2024, Google Chrome enabled ML-KEM (a post-quantum key encapsulation mechanism) by default for TLS 1.3 and QUIC, enhancing security for connections between Chrome Desktop and Google services against future quantum computer threats.
Challenges
ML-KEM (Module-Lattice-Based Key Encapsulation Mechanism), which uses lattice-based cryptography, has larger key shares because its mathematical structures require more data to ensure strong security against future quantum attacks. The extra data makes the encryption harder to break, but it results in bigger key shares than traditional methods like X25519. Despite being larger, these key shares are designed to keep data secure in a world with powerful quantum computers.
The table below provides a comparison of the key and ciphertext sizes when using hybrid cryptography, illustrating the trade-offs in terms of size and security:
Algorithm Type | Algorithm | Public Key Size | Ciphertext Size | Usage |
---|---|---|---|---|
Classical Cryptography | X25519 | 32 bytes | 32 bytes | Efficient key exchange in TLS. |
Post-Quantum Cryptography | Kyber-512 | ~800 bytes | ~768 bytes | Moderate quantum-resistant key exchange. |
Post-Quantum Cryptography | Kyber-768 | 1,184 bytes | 1,088 bytes | Quantum-resistant key exchange. |
Post-Quantum Cryptography | Kyber-1024 | 1,568 bytes | 1,568 bytes | Higher security level for key exchange. |
Hybrid Cryptography | X25519 + Kyber-512 | ~832 bytes | ~800 bytes | Combines classical and quantum security. |
Hybrid Cryptography | X25519 + Kyber-768 | 1,216 bytes | 1,120 bytes | Enhanced security with hybrid approach. |
Hybrid Cryptography | X25519 + Kyber-1024 | 1,600 bytes | 1,600 bytes | Robust security with hybrid methods. |
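The hybrid sizes are simply the sum of the classical and post-quantum shares, which is easy to verify:

```python
# Hybrid key-share size = classical share + post-quantum share.
X25519_PK = 32        # X25519 public key, bytes
KYBER768_PK = 1184    # Kyber-768 public key, bytes
KYBER1024_PK = 1568   # Kyber-1024 public key, bytes

print(X25519_PK + KYBER768_PK)   # 1216 -> X25519 + Kyber-768
print(X25519_PK + KYBER1024_PK)  # 1600 -> X25519 + Kyber-1024
```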
In the following Wireshark capture from Google, the group identifier "4588" corresponds to the "X25519MLKEM768" cryptographic group within the ClientHello message. This identifier indicates the use of a hybrid X25519 + ML-KEM (Kyber-768) key share, which has a size of 1,216 bytes, significantly larger than the traditional X25519 key share size of 32 bytes:
[Image: Wireshark capture of a ClientHello showing group 4588 (X25519MLKEM768) with a 1,216-byte key share]
As illustrated in the images below, the integration of Kyber-768 into the TLS handshake significantly impacts the size of both the ClientHello and ServerHello messages.
[Images: ClientHello and ServerHello message sizes with Kyber-768 integrated into the TLS handshake]
Future additions of post-quantum cryptography groups could push handshake messages further beyond typical MTU sizes. Messages that exceed the MTU can lead to challenges such as fragmentation, network incompatibility, increased latency, error propagation, network congestion, and buffer overflows. These issues necessitate careful configuration to ensure balanced performance and reliability in network environments.
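The MTU pressure is simple arithmetic. Assuming a typical 1,500-byte Ethernet MTU and, hypothetically, a ClientHello that grows to about 1,800 bytes once the 1,216-byte hybrid key share and other extensions are included, the hello no longer fits in one packet:

```python
# Rough fragmentation estimate (sizes are illustrative assumptions).
import math

MTU = 1500           # typical Ethernet MTU, bytes
client_hello = 1800  # assumed ClientHello size with a hybrid key share

packets = math.ceil(client_hello / MTU)
print(packets)       # 2 -> the hello now spans multiple packets
```

Each extra packet adds reassembly work for middleboxes, which is exactly the NGFW concern discussed next.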
NGFW Adaptation
The integration of post-quantum cryptography (PQC) in protocols like TLS 1.3 and QUIC, as seen with Google’s implementation of ML-KEM, can have several implications for Next-Generation Firewalls (NGFWs):
- Encryption and Decryption Capabilities: NGFWs that perform deep packet inspection will need to handle the larger TLS handshake messages that result from ML-KEM's larger key shares and ciphertexts. This increased data load may require updates to processing capabilities and algorithms to manage the added computational burden efficiently.
- Packet Fragmentation: With larger messages exceeding the typical MTU, resulting packet fragmentation can complicate traffic inspection and management, as NGFWs must reassemble fragmented packets to effectively analyze and apply security policies.
- Performance Considerations: The adoption of PQC could impact the performance of NGFWs due to the increased computational requirements. This might necessitate hardware upgrades or optimizations in the firewall’s architecture to maintain throughput and latency standards.
- Security Policy Updates: NGFWs might need updates to their security policies and rule sets to accommodate and effectively manage the new cryptographic algorithms and larger message sizes associated with ML-KEM.
- Compatibility and Updates: NGFW vendors will need to ensure compatibility with PQC standards, which may involve firmware or software updates to support new cryptographic algorithms and protocols.
By integrating post-quantum cryptography (PQC), Next-Generation Firewalls (NGFWs) can provide a forward-looking security solution, making them highly attractive to organizations aiming to protect their networks against the continuously evolving threat landscape.
Conclusion
As quantum computing advances, it poses significant threats to existing cryptographic systems, making the adoption of post-quantum cryptography (PQC) essential for data security. Implementations like Google’s ML-KEM in TLS 1.3 and QUIC are crucial for enhancing security but also present challenges such as increased data loads and packet fragmentation, impacting Next-Generation Firewalls (NGFWs). The key to navigating these changes lies in cryptographic agility—ensuring systems can seamlessly integrate new algorithms. By embracing PQC and leveraging quantum advancements, organizations can strengthen their digital infrastructures, ensuring robust data integrity and confidentiality. These proactive measures will lead the way in securing a resilient and future-ready digital landscape. As technology evolves, our defenses must evolve too.