The transition toward cyber resilience in the face of quantum computer–enabled attacks is more relevant than ever for organizations. Experts and authorities agree that between 2030 and 2035, critical systems will need to have migrated to post-quantum cryptography (PQC) solutions.
But this transition is far from trivial. It entails a deep transformation of infrastructures, processes, and governance. Approaching such a project without a preliminary exploration phase would be risky.
This is precisely the role of pilot projects: they act as pathfinders. By testing PQC within a limited scope, organizations can assess the technical impact, evaluate operational constraints, make informed choices regarding algorithms and architectures, and, most importantly, mobilize teams well beyond the cybersecurity department. This testing phase is essential to preparing for PQC migration and anticipating its organization-wide impact.
1. Understanding the Impact on Performance
Before any large-scale deployment, it is essential to measure the tangible effects of PQC on your systems. The PQC algorithms standardized by NIST (such as ML-KEM, ML-DSA, or SLH-DSA) differ significantly from traditional algorithms (RSA, ECC) in key sizes, signature lengths, and underlying mathematical paradigms (lattice-based, hash-based, code-based). These changes are not inconsequential: they can profoundly affect how use cases perform. That is why a targeted testing phase is critical to anticipating any necessary adjustments.
Latency, Memory, Bandwidth: Everything Is Impacted by PQC
Introducing post-quantum algorithms will have a direct impact on your systems’ technical parameters:
- Key and certificate size: an RSA-2048 public key is 256 bytes (the size of its modulus), whereas a PQC key may reach several thousand bytes; the same applies to signatures, which in some schemes exceed tens of thousands of bytes
- Network impact: keys, signatures, and certificates exchanged during protocol negotiations (TLS, SSH, IPsec, etc.) become bulkier, increasing packet fragmentation, raising latency, and placing greater strain on communication channels
- Memory footprint: embedded systems or legacy environments, often resource-constrained, may be unable to process these heavier objects without adaptation
- Application performance: certain post-quantum cryptographic operations (signing, verification) may slow down critical processing if not properly integrated or optimized
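To make the size gap concrete, the figures published in the NIST standards (FIPS 203 for ML-KEM-768, FIPS 204 for ML-DSA-65, FIPS 205 for SLH-DSA-SHA2-128s; raw encodings, excluding certificate overhead) can be compared directly against their classical counterparts. A minimal sketch in Python:

```python
# Public-key object sizes in bytes. PQC figures are taken from FIPS 203/204/205;
# classical figures use conventional raw encodings (RSA-2048 modulus, P-256 raw
# point and raw r||s signature).
SIZES = {
    "RSA-2048":          {"public_key": 256,  "signature": 256},
    "ECDSA-P256":        {"public_key": 64,   "signature": 64},
    "ML-KEM-768":        {"public_key": 1184, "ciphertext": 1088},
    "ML-DSA-65":         {"public_key": 1952, "signature": 3309},
    "SLH-DSA-SHA2-128s": {"public_key": 32,   "signature": 7856},
}

def overhead(alg: str, field: str, baseline: int) -> float:
    """Size of a PQC object relative to a classical baseline, as a ratio."""
    return SIZES[alg][field] / baseline

if __name__ == "__main__":
    rsa_sig = SIZES["RSA-2048"]["signature"]
    for alg in ("ML-DSA-65", "SLH-DSA-SHA2-128s"):
        print(f"{alg}: signature {SIZES[alg]['signature']} B, "
              f"{overhead(alg, 'signature', rsa_sig):.1f}x RSA-2048")
```

Ratios like these feed directly into bandwidth and fragmentation estimates: for instance, a single ML-KEM-768 public key (1184 bytes) on its own approaches the typical 1500-byte Ethernet MTU.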
Choosing the Right Post-Quantum Algorithm
Only tests conducted in real or representative environments can yield reliable insights. They make it possible to:
- precisely quantify the performance gaps introduced by post-quantum algorithms
- identify situations where the impact is acceptable versus those where it becomes an operational bottleneck
- guide algorithmic choices while accounting for business constraints, security requirements, and technical limitations specific to each environment
Not all post-quantum algorithms are equal in terms of performance. Some execute very quickly but require significant memory; others are more resource-efficient but slower, prioritizing security margins over speed.
Therefore, there is no one-size-fits-all solution. This is exactly why pilot projects matter: they allow organizations to evaluate trade-offs based on their use cases and make informed technological decisions.
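Evaluating those trade-offs requires measuring operations in your own environment rather than relying on published benchmarks. The harness below is a minimal sketch: it times a stand-in workload (SHA-256 hashing), and the stand-in would be replaced with a real sign, verify, or encapsulation call from whichever PQC library the pilot selects (liboqs bindings, a vendor SDK, etc.):

```python
import hashlib
import statistics
import time

def bench(op, *args, runs: int = 200) -> dict:
    """Time a cryptographic operation over several runs; report milliseconds."""
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        op(*args)
        samples.append((time.perf_counter() - t0) * 1e3)
    return {
        "median_ms": statistics.median(samples),
        "p95_ms": sorted(samples)[int(0.95 * runs)],  # tail latency matters too
    }

# Stand-in workload only: swap in the signing/KEM primitive under evaluation.
payload = b"x" * 65536
result = bench(lambda d: hashlib.sha256(d).digest(), payload)
```

Reporting a tail percentile alongside the median is deliberate: some PQC signature schemes have variable running time, and the p95 is often what user-facing latency budgets actually feel.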
2. Anticipating Deployment Obstacles
Modern information systems form a dense, heterogeneous, and interconnected ecosystem that includes business applications, databases, communication protocols, certificates, software libraries, network equipment, and more. Cryptography permeates it all, often invisibly, in a fragmented and poorly documented way.
In this context, introducing post-quantum algorithms can easily disrupt established functional baselines. Unexpected issues are inevitable. Identifying them early helps avoid deployment roadblocks and limits additional cost due to last-minute adjustments.
Three Types of Pitfalls to Identify Before Deployment
1. Compatibility Issues
Not all equipment, libraries, or software are ready to handle the new formats introduced by PQC.
- Some systems do not yet support new key or certificate formats.
- Protocols such as TLS or IKE may fail during negotiation if versions, extensions, or configurations are incompatible.
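Such negotiation failures can be modeled before touching production: each endpoint advertises the groups it supports, and the handshake succeeds only if the client's preference list intersects both sets. A toy sketch (group names follow TLS conventions, including the X25519MLKEM768 hybrid; the endpoints themselves are illustrative):

```python
def negotiate(client_groups, server_groups, preference):
    """Return the first mutually supported group, or None if the handshake
    would fail -- a common failure mode in early PQC rollouts."""
    common = client_groups & server_groups
    for group in preference:
        if group in common:
            return group
    return None

# Illustrative endpoints: an updated client and a legacy network appliance.
updated_client   = {"x25519", "X25519MLKEM768"}
pqc_ready_server = {"x25519", "X25519MLKEM768"}
legacy_appliance = {"x25519", "secp256r1"}  # no PQC support yet

preference = ["X25519MLKEM768", "x25519"]  # hybrid first, classical fallback
```

A pilot would run the real equivalent of this check (e.g. handshake probes against each endpoint class) to find the appliances where the fallback, or the handshake itself, breaks.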
2. System Interdependencies
A local change can have system-wide repercussions.
- Modifying a cryptographic component may invalidate signatures or cause authentication failures.
- Automated chains (code signing, deployment, CI/CD) may be disrupted by unfamiliar formats.
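One concrete way a pilot surfaces these repercussions is to start from a signed-artifact inventory and flag everything a post-migration verifier would reject. A minimal sketch, assuming a simple inventory format (the artifact names, the policy, and the deprecated set are all illustrative):

```python
# Illustrative post-migration policy: algorithms the new verifier will reject.
DEPRECATED = {"RSA-2048", "ECDSA-P256"}

# Illustrative inventory, as a pilot might extract it from a code-signing chain.
artifacts = [
    {"name": "installer.exe", "sig_alg": "RSA-2048"},
    {"name": "firmware.bin",  "sig_alg": "ML-DSA-65"},
    {"name": "release.tar",   "sig_alg": "ECDSA-P256"},
]

def needs_resigning(inventory):
    """List artifacts whose existing signatures would fail under the new policy."""
    return [a["name"] for a in inventory if a["sig_alg"] in DEPRECATED]
```

Running this kind of check across CI/CD pipelines before the cut-over turns a silent production failure into a planned re-signing backlog.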
3. Business Process Disruptions
Certain critical functions rely on invisible cryptographic mechanisms that are central to everyday interactions.
- An unrecognized certificate or misinterpreted key can cause billing failures, block access to HR tools, or make customer support unavailable.
- Without prior testing, such disruptions may go unnoticed until a full production roll-out.
Testing as a Diagnostic Tool
Testing helps eliminate uncertainty and uncover risks often invisible in a traditional audit by:
- detecting blocking points without disrupting production environments
- comparing different migration scenarios based on use cases, systems, or algorithm choices
- documenting technical prerequisites, critical dependencies, and required action sequences
- anticipating regressions and unexpected service interruptions that can lead to operational losses
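One way to structure such a comparison is to enumerate the migration scenarios systematically, then prune combinations that make no sense before assigning them to test runs. A rough sketch (the system names, modes, and pruning rule are illustrative):

```python
from itertools import product

# Illustrative pilot scope.
systems    = ["api-gateway", "vpn-concentrator", "code-signing"]
algorithms = ["ML-KEM-768", "ML-DSA-65", "SLH-DSA-SHA2-128s"]
modes      = ["hybrid", "pqc-only"]

scenarios = [
    {"system": s, "algorithm": a, "mode": m}
    for s, a, m in product(systems, algorithms, modes)
]

def relevant(sc):
    # Illustrative rule: a KEM does not apply to a signing use case.
    return not (sc["system"] == "code-signing" and sc["algorithm"] == "ML-KEM-768")

test_plan = [sc for sc in scenarios if relevant(sc)]
```

Even a small scope like this yields more combinations than intuition suggests, which is exactly why documenting the plan up front matters.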
3. Building PQC Expertise
Pilot projects are essential for developing strong internal expertise in post-quantum cryptography and for progressively and durably engaging stakeholders across the organization.
Migrating to PQC is not merely about swapping algorithms; it is a transformational initiative affecting technology, governance, IT strategy, and internal collaboration methods. Early testing creates the conditions for structuring this transition in a sustainable and aligned manner.
New Skills to Develop
New cryptographic standards rely on different mathematical foundations such as Euclidean lattices, error-correcting codes, and stateful or stateless hash-based constructions. Integrating them requires a progressive enhancement of skills. Gone is the "single algorithm" approach, in which RSA could serve both encryption and signing: each cryptographic function (key establishment, signing, encryption) now relies on a distinct algorithm chosen according to the usage context. This technical complexity demands adoption by multiple teams, including cybersecurity, IT, architecture, development, compliance, and operations.
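In practice this often ends up encoded as an explicit per-function policy that teams can consult and tooling can enforce. A minimal sketch, with an illustrative mapping (the function names and algorithm choices are examples, not recommendations):

```python
# Illustrative per-function algorithm policy: with PQC, one algorithm no
# longer covers every use. Algorithm names follow the NIST standards.
CRYPTO_POLICY = {
    "key_establishment": "ML-KEM-768",         # FIPS 203
    "general_signing":   "ML-DSA-65",          # FIPS 204
    "firmware_signing":  "SLH-DSA-SHA2-128s",  # FIPS 205, conservative choice
}

def algorithm_for(function: str) -> str:
    """Resolve the approved algorithm for a cryptographic function."""
    try:
        return CRYPTO_POLICY[function]
    except KeyError:
        raise ValueError(f"No algorithm approved for: {function}")
```

Making the policy a queryable artifact, rather than tribal knowledge, is one of the concrete outputs a pilot can hand to development and operations teams.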
Establishing a Cross-Functional Team
Contrary to common belief, PQC migration is not only a cybersecurity concern, as it demands cross-functional expertise:
- IT architecture teams: to anticipate infrastructure impacts
- Developers: to adapt applications to new cryptographic algorithms
- Operations / DevOps: to integrate new cryptography into CI/CD pipelines
- Compliance / CISO teams: to ensure alignment with future regulatory requirements
- Executive leadership: to understand the stakes, allocate budgets, and support decision-making
The pilot project acts as both a trigger and a catalyst. It enables each stakeholder to grasp their responsibilities, confront technical realities, and build skills within a controlled framework. It often reveals an unmet need: a transversal coordination role dedicated to cryptographic asset management. This “cryptography manager” role—still absent from many organizational charts—becomes essential to orchestrating the transition toward agile, coherent, and long-term crypto governance.
Testing as a Cultural Lever
Testing PQC also initiates a cultural shift. Through feedback, documentation, cross-team exchanges, and identified challenges, pilot projects help build a shared cryptographic culture. This creates a common language, transforms practices, and strengthens cross-functionality. This is a powerful lever of change that prepares organizations to manage an increasingly strategic domain that had long remained in the background.
The Pilot Project: Enabling a Controlled Migration
The transition to post-quantum cryptography is far more complex than a simple update. It represents a deep overhaul of the cryptographic foundations underpinning organizational digital security.
Embarking on this migration without prior testing is akin to moving forward blindfolded, risking a poorly scoped, costly project that is slowed by technical surprises or even compromised from the very first phases of deployment.
By contrast, a well-designed pilot project reduces uncertainty. It clarifies technological choices, exposes organizational constraints, and maps out a path toward a gradual, structured, and successful migration. It is the first step, and an indispensable prerequisite, in building robust, agile, and sustainable post-quantum cybersecurity.