When we send encrypted information over a public communication channel, our security models must assume adversaries are recording that information in the hopes of being able to eventually break the encryption and exploit the underlying plaintext. An encryption algorithm believed to be secure today could cease to be in the future due to new advances in number theory, new cryptanalytic techniques, or even new methods of computing. It is this last risk, in particular the risk posed by the potential future development of large-scale, fault-tolerant quantum computers, that is currently the focus of much of the international cryptographic research community, driven by a worldwide open competition to select and standardize new post-quantum (a.k.a. quantum-resistant) public-key cryptographic algorithms. As we approach the first output milestone in that competition, it is critical for everyone in our industry to be aware of the coming algorithm transition, the impact it will have on existing and future systems, and the research and engineering work still needed to make the transition to post-quantum cryptography (PQC) possible.
From mobile communications to online banking to personal data privacy, literally billions of Internet users rely on cryptography every day to ensure private communications and data stay private. Indeed, the emergence and growth of the public Internet and electronic commerce was arguably enabled by the invention of public-key cryptography. The critical advantage offered by public-key cryptography is that it allows two parties who have never communicated previously to nevertheless establish a secure, private, communication channel over a non-private network (that is, the Internet). Public-key cryptography is also the technology that enables digital signatures, which are widely used to protect software and application updates, online contracts, and electronic identity credentials.
Since 1994, when Peter Shor of AT&T Bell Laboratories developed the polynomial-time quantum factoring algorithm that now bears his name, we have known that all of our widely deployed public-key cryptographic algorithms can be attacked efficiently with the aid of a cryptographically relevant (that is, "big enough") quantum computer. Whether such a quantum computer could even be built was, at the time, a purely theoretical question. Today's quantum computers are not big enough or stable enough to threaten our current algorithms, but they point the way to future devices that could. Further, while a cryptographically relevant quantum computer may not be realized for a decade or longer, its future existence already threatens the security of information we send and receive today, because an adversary can record encrypted content now and exploit it later. This "record now, exploit later" threat means we need to transition to quantum-resistant public-key algorithms well in advance of the availability of cryptographically relevant quantum computers.
Acknowledging the threat to existing cryptography posed by future quantum computers, the U.S. National Security Agency (NSA) first warned the public of the need to transition to PQC algorithms in August 2015, and in 2017 the U.S. National Institute of Standards and Technology (NIST) launched its PQC Standardization activity to select new quantum-resistant public-key algorithms. As with its prior algorithm competitions that resulted in the AES block cipher and SHA-3 hash function standards, NIST solicited PQC algorithm proposals and cryptanalysis of them from around the world. Chosen algorithms win the "prize" of being standardized as U.S. Federal Information Processing Standards (FIPS) and then being used as replacement algorithms just about everywhere public-key cryptography is used.
By the time you read this column in the January 2022 issue of Communications, NIST may have announced its first PQC algorithm selections for standardization.a These will be the first quantum-resistant public-key algorithms to be standardized through an open, competitive process, yet they will be just the first of at least three waves of "winners" to be announced by NIST. When NIST began the PQC selection process in November 2017, it received approximately 70 submissions from teams of industry and academic researchers around the world; these have since been winnowed down to seven Finalists and eight Alternates.b The selections expected now are from the Finalists; we expect a second wave of winners, drawn from the Alternates, in mid-to-late 2023. Additionally, next year NIST will reopen the digital signature portion of the competition to new proposals; winners from this "2nd Call for Proposals" may not be announced until 2026.
Selection of an algorithm and publication of a standards document describing its operation and security parameters is only the first step in migrating to it. During my more than 24 years at Microsoft, I have worked on four major cryptographic algorithm transitions: the MD5 to SHA-1 and SHA-1 to SHA-2 hash function migrations, the move from 1024-bit RSA to 2048-bit RSA, and the migration from RSA to elliptic curve cryptography (ECC). All four of those transitions were simpler than what we face in moving from RSA or ECC to a PQC scheme, and our experience with past migrations is that it takes considerable time and effort for an industry to come together to update standards with new algorithms and then deploy them. Even apparently simple replacements take significant time: the migration from the SHA-1 hash function (deprecated in 2011) to its replacement SHA-2 is still ongoing, and our transition to ECC public-key cryptography, the closest model we have for the complexity of the move to PQC, is still struggling to gain traction in some areas. Transitioning to a new algorithm takes more time, is more complicated, and impacts more functions and features than expected. Each protocol has its own standardization community, and updating each protocol is a bespoke process. Long-term support requirements for existing systems also slow the decommissioning of old algorithms.
NIST's announcement of the first set of PQC winners will thus be the starting gun for updating security protocols that use public-key cryptography, and as many of our protocols depend upon other standards and protocols, updating a single protocol may require updates to multiple standards. Given the time it takes to update any standard through its relevant process, this cascade of dependencies will extend the transition time for even our most used security protocols. Further, there is a long tail of security protocols with small implementor communities; these protocols will not migrate to PQC as quickly as commonly used protocols such as TLS, SSH, and so forth, and they will likely be more difficult to update as well. Taking all these timelines and dependencies together, it is likely that by 2030 we will still be in a "PQ-messy" state in terms of updating standards to PQC.
Much of the post-quantum transition work that must be done can start now, even while we wait for algorithm and protocol standards to be updated. A useful starting point for organizations is the NIST-managed National Cybersecurity Center of Excellence's Migration to Post-Quantum Cryptography project,c which is developing migration playbooks, recommended practices, and proof-of-concept tools. Organizations should also note the life cycles of the standards relevant to their activities and the work those standards groups are already doing to prepare for the PQC migration. Awareness is key: simply knowing to ask, "Is this system designed to accommodate the transition to PQC?" is an important first step.
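One concrete first step that needs no new standards is simply finding out where quantum-vulnerable public-key cryptography is used. As a minimal, illustrative sketch (the search patterns below are hypothetical and far from complete, and a real inventory must also cover certificates, protocol configurations, hardware security modules, and third-party services), a short script can flag algorithm names in source code and configuration files:

# Toy cryptographic-inventory sketch: walk a source/config tree and flag
# references to quantum-vulnerable public-key algorithms. Illustrative only.
import os
import re
import sys

# Identifiers commonly associated with quantum-vulnerable public-key crypto.
VULNERABLE_PATTERNS = re.compile(
    r"\b(RSA|ECDSA|ECDH|DSA|DH|X25519|Ed25519|secp256r1|P-256)\b")

def scan(root: str) -> None:
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, start=1):
                        match = VULNERABLE_PATTERNS.search(line)
                        if match:
                            print(f"{path}:{lineno}: {match.group(0)}")
            except OSError:
                continue  # unreadable file; skip it

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")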
The engineering work needed to prepare systems for the PQC migration can also start now, for example by integrating and testing with implementations of the candidate PQC algorithms or with a PQC-enlightened protocol.d The key feature to ensure in new or upgraded systems is cryptographic agility: the ability to reconfigure a system, application, or protocol with a different cryptographic algorithm, including algorithms not yet defined at the time it was initially built. Cryptographic agility is not a new security design principle, but it is often either overlooked completely in cryptography-using systems or knowingly deferred because it is viewed as a security tax with no immediate payoff. Taking steps now to ensure systems under development have sufficient cryptographic agility will avoid future pain.
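To make cryptographic agility concrete, the following sketch (illustrative Python with made-up algorithm names and a deliberately toy, insecure placeholder scheme, not any real library's API) shows one common pattern: callers obtain an algorithm by name taken from configuration, so a PQC or hybrid implementation that does not exist today can be registered and selected later without changing calling code:

# Minimal sketch of a crypto-agile key-encapsulation (KEM) abstraction.
from abc import ABC, abstractmethod
import os
from typing import Dict, Tuple, Type

class Kem(ABC):
    """Minimal key-encapsulation interface: keygen, encapsulate, decapsulate."""
    @abstractmethod
    def generate_keypair(self) -> Tuple[bytes, bytes]:
        """Return (public_key, secret_key)."""
    @abstractmethod
    def encapsulate(self, public_key: bytes) -> Tuple[bytes, bytes]:
        """Return (ciphertext, shared_secret)."""
    @abstractmethod
    def decapsulate(self, secret_key: bytes, ciphertext: bytes) -> bytes:
        """Return the shared secret."""

_REGISTRY: Dict[str, Type[Kem]] = {}

def register(name: str):
    """Class decorator: make an implementation selectable by name at run time."""
    def wrap(cls: Type[Kem]) -> Type[Kem]:
        _REGISTRY[name] = cls
        return cls
    return wrap

def new_kem(name: str) -> Kem:
    """Instantiate whichever algorithm the current policy or configuration names."""
    return _REGISTRY[name]()

@register("demo-toy-kem")
class DemoToyKem(Kem):
    """Toy, insecure stand-in used only to exercise the interface."""
    def generate_keypair(self):
        sk = os.urandom(32)
        return sk, sk              # toy scheme: 'public' key equals the secret key
    def encapsulate(self, public_key):
        shared = os.urandom(32)
        ct = bytes(a ^ b for a, b in zip(shared, public_key))
        return ct, shared
    def decapsulate(self, secret_key, ciphertext):
        return bytes(a ^ b for a, b in zip(ciphertext, secret_key))

# A PQC or hybrid implementation (for example, one wrapping a library such as
# liboqs) would later be registered under a new name; callers keep selecting
# the algorithm from configuration rather than hard-coding it.
if __name__ == "__main__":
    kem = new_kem("demo-toy-kem")  # in practice: new_kem(config["kem_algorithm"])
    pk, sk = kem.generate_keypair()
    ct, sender_secret = kem.encapsulate(pk)
    assert kem.decapsulate(sk, ct) == sender_secret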
This is also the time to incentivize new research to help with the many challenges associated with cryptographic algorithm migrations and the building of cryptographically agile systems. New tools and frameworks for ensuring cryptographic agility that can be configured and controlled by policy are sorely needed. Gaps are especially notable in agility frameworks for large-scale infrastructures (for example, private data centers and public clouds) and distributed application architectures (for example, Web services and distributed container schemes). Agility frameworks must not only enable policy-driven configuration, they must also be protected from adversarial tampering and attack; how agility creates new attack surfaces, and how those attack surfaces will be protected, is another key area of research. In addition, research on the usability of transition tools and on the human and organizational factors in secure coding is needed to support the security workforce during this transition.
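As one illustration of what policy-driven agility could look like (the fields, algorithm names, and key handling below are hypothetical), a framework might select algorithms from a policy document whose integrity is verified before use; a production framework would additionally need organization-wide policy distribution, signing, and audit:

# Sketch of policy-driven algorithm selection with a basic tampering defense.
import hashlib
import hmac
import json

# A hypothetical policy document: which algorithms are permitted and at what
# minimum security level. Field names are illustrative, not from any standard.
POLICY_JSON = json.dumps({
    "allowed_kems": ["demo-pqc-kem", "demo-hybrid-kem"],
    "allowed_signatures": ["demo-pqc-signature"],
    "min_security_level": 3,
}).encode()

def load_policy(policy_bytes: bytes, tag: bytes, key: bytes) -> dict:
    """Reject a policy whose integrity tag does not verify."""
    expected = hmac.new(key, policy_bytes, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("policy integrity check failed")
    return json.loads(policy_bytes)

def negotiate_kem(policy: dict, peer_offers: list) -> str:
    """Pick the first peer-offered key-establishment algorithm the policy permits."""
    for name in peer_offers:
        if name in policy["allowed_kems"]:
            return name
    raise ValueError("no mutually acceptable algorithm")

if __name__ == "__main__":
    key = b"demo-policy-mac-key"   # hypothetical; in practice from a KMS or HSM
    tag = hmac.new(key, POLICY_JSON, hashlib.sha256).digest()
    policy = load_policy(POLICY_JSON, tag, key)
    print(negotiate_kem(policy, ["legacy-rsa-kem", "demo-pqc-kem"]))  # -> demo-pqc-kem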
The new algorithms being considered by NIST will have significantly different parameters (key sizes, ciphertext sizes, signature sizes) and computational requirements (memory, processing cycles, network latencies), as well as some new requirements (for example, entropy, failure handling). As such, research is also needed to examine the implications of these differences for a wide variety of cryptography usage domains and to understand performance challenges. One particular area of concern is whether the new PQC algorithms will be sufficiently performant and energy efficient on constrained IoT devices with limited compute resources and battery power.
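For a rough sense of scale, the following sketch compares public-key and signature or ciphertext sizes for today's algorithms and several NIST Round 3 candidates; the figures are approximate values taken from the Round 3 submission documents and should be re-checked against the final standards:

# Approximate sizes, in bytes; treat as illustrative only.
SIZES_BYTES = {
    # name: (public key, signature or ciphertext)
    "RSA-2048 signature":      (256, 256),
    "ECDSA P-256 signature":   (64, 64),
    "Dilithium2 signature":    (1312, 2420),
    "Falcon-512 signature":    (897, 666),
    "SPHINCS+-128s signature": (32, 7856),
    "ECDH P-256 key exchange": (64, 64),
    "Kyber-768 KEM":           (1184, 1088),
}

for name, (public_key, obj) in SIZES_BYTES.items():
    print(f"{name:25s} public key ~{public_key:5d} B   signature/ciphertext ~{obj:5d} B")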
Quantum computers are coming, even if we do not know today when they will achieve scale or become commonplace and accessible. While they promise to enable new computational workloads that are not feasible with classical computing, they also represent an existential threat to our widely deployed cryptographic algorithms and standards. Because the information we encrypt and transmit with today's public-key cryptography can be recorded by an adversary and stored for potential exploitation, it is urgent that we work now to counter the implications of future quantum computing for today's cryptography. The sooner we make our systems cryptographically agile and migrate them to post-quantum cryptographic algorithms, the more quickly we can move to counter this future threat. This algorithm migration will be more involved and complicated than any we have faced in the past, so early awareness, education, and planning are key. The more we can do today to prepare for this disruption, the easier our overall migration journey will be as an industry.
a. See https://bit.ly/3wERLhP
b. Disclosure: I and members of my team at Microsoft Research are co-submitters of two Alternate encryption proposals (FrodoKEM and SIKE) and one Alternate digital signature proposal (Picnic).
c. See https://bit.ly/3olfekw
d. For example, the University of Waterloo's Open Quantum Safe project, https://openquantumsafe.org/, maintains open source, crypto-agile PQC libraries covering all of the Finalists, as well as test integrations into widely used security protocol implementations such as OpenSSL.
This Viewpoint is adapted and revised from the CRA Quadrennial paper Post Quantum Cryptography: Readiness Challenges and the Approaching Storm, Campagna, M., LaMacchia, B., and Ott, D. (2020); https://bit.ly/3HgMkun