Two papers came out last week that suggest classical asymmetric cryptography might indeed be broken by quantum computers in just a few years.

That means we need to ship post-quantum crypto now, with the tools we have: ML-KEM and ML-DSA. I didn't think PQ auth was so urgent until recently.

https://words.filippo.io/crqc-timeline/

A Cryptography Engineer’s Perspective on Quantum Computing Timelines

The risk that cryptographically-relevant quantum computers materialize within the next few years is now high enough to be dispositive, unfortunately.

@filippo I'm still firmly in the camp of those believing that QC is largely stock market manipulation and a snake oil fuelled research grant grift... BUT...
Equally, if we have the post-QC crypto math, it only makes sense to use it. I don't see the downside.
@tmcfarlane I think Scott Aaronson frequently makes the case that the answer to "stock market manipulation or actual progress" is "both, by different companies."
@filippo Neither algorithm has been extensively tested and analysed. The risk that they are broken on classical computers could be significantly higher than the risk that a quantum computer can do what the papers state. Instead of validating this risk in practice, quantum computers only work on artificial, irrelevant problems (not actually trying to break keys). It would be good to see some real case (even a small one) where they try to do it; this would help us understand the risk.
@filippo Quote from a paper that you cite: "our most time-efficient architectures can potentially enable runtimes of 10 days for ECC–256 with ≈ 26,000 qubits, and 97 days for RSA–2048 with ≈ 102,000 qubits"
This is for one key! If all "substantial engineering challenges" are solved.
It was not the scope of your post, but a broader assessment of Confidentiality, Integrity, and Availability risks with some concrete estimates would help (which is maybe more a job for an IT Security Risk Manager).

@jornfranke I encourage you to reread the article because it addresses all your objections, especially the "why did they not break a small key".

I will add that the cryptography experts are actually very confident in the security of lattices. https://keymaterial.net/2025/12/13/a-very-unscientific-guide-to-the-security-of-various-pqc-algorithms/

A very unscientific guide to the security of various PQC algorithms

After publishing my series on UOV, one feedback I got was that my blog posts made people feel more confident in the security of the scheme, because “at least someone is looking into these thi…

Key Material
@filippo I found no good argument for this. There is just a bogus comment that nobody asked the Manhattan Project to create a small nuclear explosion. It has nothing to do with the topic, and they of course did various tests and experiments to validate what they were doing.
It is a typical distraction from the fact that they cannot even solve small problems on a QC.
@filippo Cryptographic experts might be confident in the security of lattices, but I would not be confident in their secure implementation. It took decades to get the implementation right for classical algorithms, and they are still often implemented incorrectly. This is a big security problem.
@jornfranke I am a cryptography engineer so I can tell you from experience: no, ML-KEM and ML-DSA are easier to implement and easier to test than all their classical alternatives.
@filippo By implementation I mean not a specific library or tool, but the whole security system, e.g. integration into applications and into security operations. They will require significant changes in all aspects of the system and its operation.
@filippo Do not get me wrong. I believe we need to be crypto-agile, as at the moment it is a mess even to update the algorithm of one application to the latest version. Here I also agree with Bruce Schneier (https://www.schneier.com/blog/archives/2026/04/google-wants-to-transition-to-post-quantum-cryptography-by-2029.html). However, one should not do this in panic mode; get crypto-agility right first before moving to such impactful changes in an organisation.
Google Wants to Transition to Post-Quantum Cryptography by 2029 - Schneier on Security

Google says that it will fully transition to post-quantum cryptography by 2029. I think this is a good move, not because I think we will have a useful quantum computer anywhere near that year, but because crypto-agility is always a good thing. Slashdot thread.

Schneier on Security

@filippo @jornfranke I wouldn't say they are necessarily easier, but we stand on the shoulders of giants; these days we have tools and knowledge the RSA pioneers hadn't even dreamt of yet, so a lot of the pitfalls are much easier to avoid.

The cryptography community also builds bulletproof algorithms these days, not just primitives everyone has to figure out how to make secure by themselves.

All that said, ML-KEM and ML-DSA's power and timing side-channel analysis is much harder, but doable!

@jornfranke @filippo
And where are we in the usable qubits race?

@yacc143 @jornfranke @filippo

This is a tricky question because the definitions shift. A lot of these algorithms require all of the qubits to be in the same quantum circuit, but not all devices allow this and two smaller quantum computers do not combine to make a single large one in the way classical ALUs can. A lot of the recent publications have been focused around using multiple physical qubits to handle the various error conditions and present as a single logical qubit, so a system with a thousand qubits may only be one qubit from the perspective of a quantum computing algorithm that’s modelled on a perfect quantum computer.
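As a back-of-the-envelope illustration of that overhead (the numbers below are illustrative assumptions, not from this thread): a surface code of distance d is commonly estimated at roughly 2*d**2 - 1 physical qubits per logical qubit, so a headline physical-qubit count shrinks dramatically when converted to logical qubits.

```python
# Rough illustration of physical-vs-logical qubit counts.
# Assumption: a surface code of distance d uses about
# 2*d**2 - 1 physical qubits per logical qubit
# (d**2 data qubits plus d**2 - 1 ancilla qubits).

def physical_per_logical(d: int) -> int:
    """Approximate physical qubits behind one surface-code logical qubit."""
    return 2 * d**2 - 1

def total_physical(logical_qubits: int, d: int) -> int:
    """Approximate physical qubits for an algorithm needing `logical_qubits`."""
    return logical_qubits * physical_per_logical(d)

# A device with ~1,000 physical qubits at distance d=22 yields
# roughly ONE logical qubit:
print(physical_per_logical(22))       # 967

# If an attack needed, say, 3,000 logical qubits at d=25, that would
# translate to millions of physical qubits:
print(total_physical(3000, 25))       # 3747000
```

The specific distance and logical-qubit counts here are made up for illustration; the point is only the order-of-magnitude gap between the two ways of counting.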

@david_chisnall what about quantum gates? Aren’t they as important as the number of required qubits?


@filippo Can’t wait for the first quantum attack on cryptocurrency. The minute North Korea gets their hands on a large enough quantum computer, that one’s happening.

@filippo

Considering the stakes, combined with the scope of resources some superpowers possess, plus the "disclose this and it might cost you your _life_" level of "nda" they enforce, I wouldn't count on "a few years". Might already be here?

I suspect that anyone who has the actual info as to what the state-of-the-art crypto-breaking capabilities are at the level of a military superpower certainly isn't in a position to talk openly about it.

@filippo Couldn't agree more with "the bet is 'are you 100% sure a CRQC will NOT exist in 2030?'" — and I'd also add the operational perspective: "are you 100% sure you've found and replaced every Debian oldoldstable and RHEL 8 box that doesn't support PQC by 2030?"
@neverpanic oh Debian oldstable is not gonna make it. stable might not make it! I have a secret, over-optimistic wish that this will kill the "constantly run software 3-5 years out of date" model of distribution, and free us upstreams from having to deal with its fallout, but I know it won't.
@filippo @neverpanic That model is the sad result of upstreams' inability to provide / disinterest in providing stable interfaces. (I don't know how Go is in this regard.)
Go 1 and the Future of Go Programs - The Go Programming Language

@neverpanic @filippo
We inventoried _everything_ once for Y2K and once for DST 2007 🇺🇸 , at least here.

We _can_ and _will_ do it again.
(Doing it for both CRQC and Y2038 simultaneously would be a cost-savings.)

(If you can't make a 100% inventory, ask your Red Team or contact a gray-hat hacker, see if they'll take $ for becoming white-hat red-vest on Red Team.)

@filippo @robpike Here's an NSA publication on this topic, from 10 years ago. What I love about this is how they describe their requirements: they have to field systems and guarantee their security for 30 years into the future.

https://archive.org/details/cnsa-suite-and-quantum-computing-faq/mode/2up

CNSA Suite And Quantum Computing FAQ : National Security Agency : Free Download, Borrow, and Streaming : Internet Archive

NSA Guidance on crypto algorithms, and defending against quantum computing.

Internet Archive

@filippo What about WebAuthn, Passkeys, etc?

I don't see any movement on that side of the pond. Just as we are convincing everyone to switch to them

@arianvp I do think they should get moving. But also, a passkey with a broken signature algorithm is still more secure than a password: the attacker needs the public key to fake a signature, and that's only in the website's database. I think it should still be phishing-resistant, too.
@filippo yeh I guess the privacy-preserving aspects of the WebAuthn API paid off here.
@arianvp @filippo There is movement in that area already, standards are being updated, and a few vendors seem to have development hardware tokens already, but it'll be a while until this becomes widely available.
@filippo I thought the hybrid scenario was if ML-KEM is broken conventionally?
@timbray yeah, if ML-KEM is broken classically before the CRQCs arrive (after which hybrid doesn't help you anymore). Which part are you responding to?

@filippo
> In symmetric encryption, we don’t need to do anything, thankfully

Ok. So the IoT garden must rewind to Kerberos or physical provisioning. I can't imagine lattices on small silicon yet.

Thank you, Filippo.

@ohir @filippo I’m involved in modernizing crypto for a protocol that needs to work at 9600 baud. These key sizes are going to be a terrible problem… and that’s just the bandwidth side; then there’s RAM and CPU on small embedded devices.
@mikaeleiman @ohir yup, it's not great, but it is what it is. (RAM is kinda fine if you optimize for it, and CPU is actually faster than classical. But yes, size sucks.)
@filippo @mikaeleiman @ohir Small devices which are part of a bigger system, but which are unable to take care of themselves should be behind a gateway/proxy anyway.
@KoosPol @filippo there’s no gateway to hide behind in this case unfortunately, it’s all small devices (and no Internet involved). And we’d like to be able to firmware update existing devices to the new stuff, replacing with new hardware would be very expensive.
@mikaeleiman @filippo Then you have a system architecture problem. You can't have important devices in operation without proper safeguards. And if one of the safeguards is crypto, while these devices can't deliver, then you need to tuck them behind a gateway. And if you can't, then the devices aren't important enough. You can't have your cake and eat it too.

@KoosPol @filippo That's why I'm looking forward to someone coming up with a quantum resistant variant of key exchange with smaller keys that's workable for smallish embedded MCUs :]

Currently it's looking like there's no way forward for our users that's not expensive, disruptive and time consuming.

@KoosPol @filippo @mikaeleiman
> should be behind a gateway/proxy anyway

IoT devices are not expected to have any external protection. This is a consumer market. We cannot expect home owners to provide a secure environment before they change an old dumb LED bulb for a new one equipped with BLE, motion sensors, and microphones. Whether they use the "smart" features of this bulb or not.

Then we have many devices mandated by law that must be put in the car for safety and national security reasons. A prime example is the direct TPMS sensor. E.g. a hackable TPMS in the tire will not affect our ability to accurately monitor good actors' movements, but it may affect our ability to track bad actors' movements if they are able to tamper with the IDs emitted by the tire to mimic a secure-facility employee's car.

@mikaeleiman
> then there’s RAM and CPU on small embedded devices.

This. You can think of attesting just the KDC with an ML signature, likely on some bigger silicon, but not on half of the coming cheap STM lines, for example.

Time to study RFC 4120 and siblings :(

@filippo

I'd like to ask whether migrating from Ed25519 to Ed448 makes any sense? At least temporarily. Setting aside all the awful implementation disadvantages of Ed448.

@ohir @mikaeleiman nope, the distance between now and Ed25519 is way larger than between Ed25519 and Ed448.

@ohir @filippo FN-DSA may be a solution for this class of devices.

(Or LMS and XMSS, although those two are such a big footgun nobody should be using them, *especially* not for signing — verification is OK.)

@neverpanic @filippo
> LMS/XMSS
I'd bet only on something that gets the most attention from the cryptographers' community. And certainly I am quite in shock seeing past gossip become reality now.

If Ed448 is off, and the larger NIST curves too, then small CPUs over slow links are toast, at least regarding authentication. Likely. We need to sit, think, and cool down.

And I think that now threat model assessment will be the most important part of the design procedure. Do we need to be secure against the gov/corporate adversary of a big country, or a small one? Does a breach affect just one site or all users? And so on.

@filippo I still think hybrid is the way to go. PQ crypto algorithms and their implementations are still very new, with undiscovered flaws. If you use hybrid and PQ is broken by a bug or flaw, no problem, you still have the same protection or better than the classical one.
Even when quantum computers exist you'd have to break both the classical one (with a quantum computer) and the PQ one (with an implementation flaw, or mathematical breakthrough).
If you deploy only PQ and a flaw is found you are *worse off* than classical; depending on how bad the flaw is, you might not be much better than transmitting in plain text.

IOW a PQ crypto algorithm protects against an attack from a machine which doesn't yet exist. Deploying it standalone makes you vulnerable to a bug that doesn't yet exist. *But* we've seen a steady stream of bugs in OpenSSL, and it is very likely that there will be one in the PQ implementation too.
I think it is more likely that such a bug is discovered before a quantum computer is built that is capable of a practical attack.

For example, there could be side channel attacks if you forget to implement protections similar to RSA blinding (constant-time CPU instructions are not side-channel free; see the latest Hertzbleed attack from 2025 about remote power analysis leaks). And there are probably plenty of other "classical" attacks that will work on PQ algorithms too, since they execute on a classical computer...

Of course implementation flaws in a classical+PQ hybrid could be worse off than just classical too (e.g. some C memory bug), but that might be an acceptable risk.

I'm not sure what the best ordering for a hybrid would be, but I guess PQ encryption first, then classical? So you always have to break the classical first (which won't be instant, even with quantum computers).

There is of course a performance cost, but AFAICT encryption isn't really the bottleneck in TLS, from some testing with 'curl' and 'stunnel' they achieve much lower speeds than what 'openssl speed' reports, so increasing encryption time may not affect overall time that much.
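The hybrid idea in the post above can be sketched with a stdlib HKDF that binds both shared secrets into one session key, so an attacker must recover both inputs. The `combine` function, the `hybrid-v1` salt label, and the transcript argument are illustrative assumptions, not from any standard; real deployments (e.g. TLS's X25519MLKEM768 hybrid group) similarly concatenate both shared secrets before key derivation.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869) with SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand (RFC 5869) with SHA-256."""
    out, t, counter = b"", b"", 1
    while len(out) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        out += t
        counter += 1
    return out[:length]

def combine(classical_ss: bytes, pq_ss: bytes, transcript: bytes) -> bytes:
    """Hypothetical hybrid combiner: derive one session key from both
    shared secrets, binding in a transcript of the exchanged public keys.
    Breaking the session key requires recovering BOTH shared secrets."""
    prk = hkdf_extract(b"hybrid-v1", classical_ss + pq_ss)
    return hkdf_expand(prk, transcript, 32)

# Dummy inputs for illustration; in practice classical_ss would come from
# e.g. X25519 and pq_ss from ML-KEM decapsulation.
key = combine(b"a" * 32, b"b" * 32, b"handshake-transcript")
print(len(key))  # 32
```

With a combiner like this, the ordering question mostly disappears: neither secret is "first", since both feed the same KDF.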

@edwintorok @filippo I don't have the toot at hand. But @djb is very vocal about hybrid PQ being the NSA's wet dream.

@KoosPol @edwintorok @filippo @djb

I think he's saying that non-hybrid ML-KEM is the NSA's wet dream, in this post and others: https://blog.cr.yp.to/20260221-structure.html

@kbr @edwintorok @filippo @djb Thank you for the correction! (and never hire me as crypto consultant)

@filippo you mentioned that file encryption is particularly vulnerable to store-now-decrypt-later – do i understand correctly that e.g. an old passage database which uses pre-quantum recipients is vulnerable if the encrypted files are leaked or compromised?

the mitigation here would be to migrate to a new passage db using *only* post-quantum-proof recipients *and then also* rotate all the passwords in the database, right?

@timezone I'm afraid so, yes. (With the asterisk that the attacker also needs the public key to use a QC.)

@filippo Thanks for the post!

Regarding AES128: is it safe *for the moment* or do you think it should be safe for a long time yet ?

And what's your opinion on pre-shared keys ?

I often work on web push, and its encryption (RFC 8291) uses an "authentication secret" that, if I understand correctly, works as a pre-shared key (cf. below), and AES128. I wonder if it's safe to continue with this protocol

PRK_key = HMAC-SHA-256(auth_secret, ecdh_secret);
IKM = HMAC-SHA-256(PRK_key, key_info || 0x01)
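The excerpt above is the HKDF extract step plus the first expand step; a stdlib sketch (the helper name and the dummy input values are illustrative, not from RFC 8291):

```python
import hashlib
import hmac

def derive_ikm(auth_secret: bytes, ecdh_secret: bytes, key_info: bytes) -> bytes:
    """Mirror the excerpted derivation:
    PRK_key = HMAC-SHA-256(auth_secret, ecdh_secret)
    IKM     = HMAC-SHA-256(PRK_key, key_info || 0x01)"""
    prk_key = hmac.new(auth_secret, ecdh_secret, hashlib.sha256).digest()
    return hmac.new(prk_key, key_info + b"\x01", hashlib.sha256).digest()

# Dummy inputs for illustration; in the real protocol auth_secret is a
# 16-byte pre-shared value and ecdh_secret comes from ECDH key agreement.
ikm = derive_ikm(b"\x00" * 16, b"\x11" * 32, b"WebPush: info\x00")
print(len(ikm))  # 32
```

Note how `auth_secret` keys the first HMAC: without it, knowing `ecdh_secret` alone does not yield the derived key.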

@S1m it's as safe as it always was, and as safe as if QCs were impossible. (Which is to say very safe, no one really thinks AES will get surprise broken.)

PSKs are fine if you can keep them from being compromised. The scheme you excerpted should be fine if auth_secret is not known to the attacker.

@filippo Thanks a lot for the answer!

So, that's nearly what I had in mind. I wasn't sure about the AES key length: I remember reading that 128 was enough in some places, and that we should move to 256 in others

@filippo
I need to figure out how to use a yubikey and a PQ key (in a file) for age/passage now :)
@filippo The references generally discuss the matter in the form of theoretical models. Having an engineer's theory-vs-practice mindset, I question whether there is any correlation between advances in these theoretical models and actual, physical quantum computers that don't solely exist in highly controlled lab environments.

@filippo Is anything pointing to actual quantum computers being made?

Dr Hossenfelder has made a few videos on the topic that at least convinced me they're likely never going to be able to scale as much as needed.