In our previous blog post, we focused on the Post-Quantum
Cryptography (PQC) standardization effort, led by the National Institute of Standards and Technology (NIST), and the selected
PQC standards of the future. In this blog post, we discuss some of the challenges associated with the PQC migration and examine
the strategies proposed for dealing with those challenges.
The announced PQC standards and their recently published drafts bring us a step closer to the broad deployment of PQC. The PQC migration will be the most significant transformation of the public-key cryptography landscape to date, impacting billions of devices and the world’s digital security.
The PQC Migration: A Road Paved with Challenges
Modern digital infrastructures are profoundly reliant on traditional asymmetric cryptography based on RSA or ECC schemes. The
migration to PQC will entail numerous challenges. First, scheme and implementation aspects have to be considered. This includes
not only key, ciphertext and signature sizes, but also memory and efficiency impacts. Second, many public-key infrastructures
need to be updated. In particular, some of the schemes being standardized by NIST are functionally different from RSA- and
ECC-based schemes, meaning that widely used protocols need to be modified accordingly. Third, the timeline at which migration is
necessary — or even desirable — has to be the result of a careful risk analysis of each individual use case. For
example, high-impact infrastructures are a more likely target for malicious entities than an IoT device designed for smart
homes. The former should have a migration plan in place now, but the latter might never need to migrate.
Beyond migration, it is important to note that the PQC algorithms being standardized are less mature than their traditional
counterparts, especially regarding physical security. When it comes to embedded devices, particularly ones deployed in hostile
environments, protecting against physical adversaries comes with its own set of challenges. For instance, we have discussed the
difficulty of hardening most of the PQC Key Encapsulation Mechanisms (KEMs), and some
unconventional approaches to hinder attacks for some use cases. Clearly, wherever PQC is deployed, the security of the mechanisms and protocols relying on it is critical: it is paramount to create a system that preserves the security currently offered by traditional cryptography while adding protection against quantum attackers.
NXP is working with customers to protect against the threats posed by quantum computers. Read more on our dedicated Post-Quantum Cryptography page.
Hybrid PQC Mechanisms: A Solution to Hedge Your Bets
One possible way to achieve both traditional and post-quantum protection is the use of Hybrid PQC, which combines a traditional
asymmetric cryptographic scheme with a post-quantum scheme. Many national agencies, including the German BSI and the French ANSSI, recommend hybrid approaches. In a hybrid system, security relies on two or more cryptographic schemes: breaking
only one of the schemes does not entail a full break of the system. In general, the communication or storage overhead of
performing the traditional algorithm alongside PQC will be minimal compared to performing PQC on its own, so this hedging comes
at a relatively low cost. For digital signatures, deployment can be as simple as including both a traditional signature (ECDSA or RSA) and a PQC signature (ML-DSA, SLH-DSA, LMS or XMSS) and requiring both to pass verification. For key establishment, the situation is a little more complex: combining a KEM (such as ML-KEM) with ECDH(E) requires some care.
Moreover, the approach suggested for the TLS handshake differs from those taken in other protocols.
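The AND-composition for hybrid signatures described above can be sketched as follows. This is a minimal illustration, not a production implementation: the verifier callables and message are hypothetical stand-ins for real ECDSA and ML-DSA verification routines.

```python
from typing import Callable

def verify_hybrid(message: bytes,
                  verify_traditional: Callable[[bytes], bool],
                  verify_pqc: Callable[[bytes], bool]) -> bool:
    """AND-composition: the hybrid signature is valid only if BOTH
    the traditional and the PQC component signatures verify."""
    # Evaluate both verifiers unconditionally rather than short-circuiting,
    # so a failure in one component does not skip the other check.
    ok_traditional = verify_traditional(message)
    ok_pqc = verify_pqc(message)
    return ok_traditional and ok_pqc

# Hypothetical stand-ins for real ECDSA / ML-DSA verifiers:
msg = b"firmware image v1.2"
assert verify_hybrid(msg, lambda m: True, lambda m: True) is True
assert verify_hybrid(msg, lambda m: True, lambda m: False) is False  # PQC check fails -> reject
```

An attacker must now forge both signatures: a quantum attacker breaking ECDSA still faces ML-DSA, and a cryptanalytic break of the newer PQC scheme still leaves ECDSA standing.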
Standards and guidelines for hybrid mechanisms are still being developed and will be essential for interoperability. For example, in key
exchange, standards and guidelines ensure that the communicating parties put their inputs to a key-derivation function in the
correct order and format, so as to both calculate the same session key and proceed reliably with further secure communication.
These standards and guidelines will also help ensure that products and systems avoid using weaker mechanisms.
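As an illustration of why input ordering matters, the sketch below derives a session key by feeding the two shared secrets into HKDF (RFC 5869) in a fixed, agreed order. The secrets are random placeholders standing in for real ECDH and ML-KEM outputs, and the label string is a hypothetical choice; real protocols define their own combiner and context binding.

```python
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF-SHA256 (RFC 5869): extract, then expand."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand step
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def combine_shared_secrets(ss_ecdh: bytes, ss_mlkem: bytes, transcript: bytes) -> bytes:
    # Both parties MUST concatenate in the same agreed order (here: ECDH || ML-KEM)
    # and bind the derivation to the handshake transcript.
    return hkdf_sha256(ss_ecdh + ss_mlkem, salt=b"\x00" * 32,
                       info=b"hybrid-key-v1|" + transcript)

# Placeholder secrets; in practice these come from X25519 and ML-KEM.
ss_ecdh, ss_mlkem = os.urandom(32), os.urandom(32)
key_a = combine_shared_secrets(ss_ecdh, ss_mlkem, b"handshake-transcript")
key_b = combine_shared_secrets(ss_ecdh, ss_mlkem, b"handshake-transcript")
assert key_a == key_b  # same inputs in the same order -> same session key
```

If one party concatenated the secrets in the opposite order, the two sides would derive different session keys and the handshake would fail, which is exactly the interoperability problem the standards aim to prevent.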
Cryptographic Agility: An Ambitious Goal
Cryptographic agility mitigates the impact of future updates that may become necessary, whether to support upcoming PQC standards or to respond to cryptanalytic advances as soon as they occur. It can be defined as the capacity of a system to be easily altered to meet new security or regulatory requirements. This entails not only migrating from one algorithm to another but also broader flexibility with respect to implementations and security parameters.
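One concrete (and purely illustrative) way to build in algorithm agility is to dispatch cryptographic operations through a registry keyed by algorithm identifiers, so that adding or retiring a scheme becomes a configuration change rather than a code change. The identifiers below, and the use of hash functions as the example primitive, are hypothetical.

```python
import hashlib
from typing import Callable, Dict

# Registry mapping algorithm identifiers to implementations.
DIGESTS: Dict[str, Callable[[bytes], bytes]] = {
    "sha2-256": lambda m: hashlib.sha256(m).digest(),
    "sha3-256": lambda m: hashlib.sha3_256(m).digest(),
}

def digest(alg_id: str, message: bytes) -> bytes:
    """Compute a digest via the registry, rejecting unknown identifiers."""
    try:
        return DIGESTS[alg_id](message)
    except KeyError:
        raise ValueError(f"unsupported algorithm: {alg_id}")

# Migrating to an additional primitive is a one-line registry update:
DIGESTS["sha2-384"] = lambda m: hashlib.sha384(m).digest()
assert len(digest("sha2-384", b"hello")) == 48
```

The same pattern applies to signatures or KEMs; the point is that callers reference an identifier, not a hard-coded primitive, which is a precondition for swapping algorithms later.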
For resource-constrained embedded devices, any kind of cryptographic agility comes at a steep price. It is important to evaluate
the gain-to-cost ratio, through risk analysis, and to assess if the infrastructure can adapt to alternative paradigms. For
example, a conservative approach could be to revert from PKI-based key establishment to pre-shared symmetric keys, which can be
used to securely swap out cryptosystems even if the PKI-based key establishment used for regular updates is compromised. For
many use cases, however, this would not be viable, either due to the increased key storage requirements and attack surface, or
because it is difficult to predict which devices might be paired in the future. Unfortunately, increasing agility has the
potential to introduce added complexity and vulnerabilities. It must be ensured that any alteration or adaptation to a system
can only be initiated by an authenticated source and cannot lead to a downgrade in security. In the pre-quantum world, NXP achieves this with, for example, the root of trust in its EdgeLock Secure Enclave and EdgeLock Secure Elements.
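The two conditions above, an authenticated origin and no security downgrade, can be expressed as a simple policy check. The security-level table and the verification callback below are hypothetical placeholders for a real root-of-trust check.

```python
from typing import Callable

# Hypothetical ranking of algorithm strength for downgrade decisions.
SECURITY_LEVEL = {
    "rsa-2048": 1,
    "ecdsa-p256": 2,
    "mldsa-65": 3,
    "hybrid-ecdsa+mldsa": 4,
}

def may_switch(current_alg: str, new_alg: str,
               update_is_authenticated: Callable[[], bool]) -> bool:
    """Allow a cryptographic update only if it comes from an
    authenticated source AND does not lower the security level."""
    if not update_is_authenticated():  # e.g. verified against a root of trust
        return False
    return SECURITY_LEVEL[new_alg] >= SECURITY_LEVEL[current_alg]

assert may_switch("ecdsa-p256", "hybrid-ecdsa+mldsa", lambda: True) is True
assert may_switch("mldsa-65", "rsa-2048", lambda: True) is False    # downgrade blocked
assert may_switch("ecdsa-p256", "mldsa-65", lambda: False) is False  # unauthenticated
```

In a real device the authentication callback would verify a signature chain anchored in hardware, and the level table would be part of signed, updatable policy rather than source code.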
These concerns and more will form part of a PQC risk-assessment exercise. For devices that will use (hybrid) PQC in the near
term, it is already pertinent to assess the capability of transitioning to and supporting the new algorithms in order to
minimize the timeframe required for a full system migration.
To ensure a smooth migration, significant efforts in applied research, engineering and standardization are still required.
Without doubt, there is a strong need to define, evaluate and standardize migration solutions and strategies to ensure both
security and interoperability. For now, providers of cryptographic assets should be aware of the need to develop a comprehensive
cryptographic inventory to be in a strong position to rapidly roll out agile solutions. This new hybrid or PQC future highlights
the importance and relevance of cryptographic agility as an enabler for anticipating and addressing cryptographic threats for
years to come.