Are we having fun yet?

https://arxiv.org/abs/2603.28627

Shor's algorithm is possible with as few as 10,000 reconfigurable atomic qubits

Quantum computers have the potential to perform computational tasks beyond the reach of classical machines. A prominent example is Shor's algorithm for integer factorization and discrete logarithms, which is of both fundamental importance and practical relevance to cryptography. However, due to the high overhead of quantum error correction, optimized resource estimates for cryptographically relevant instances of Shor's algorithm require millions of physical qubits. Here, by leveraging advances in high-rate quantum error-correcting codes, efficient logical instruction sets, and circuit design, we show that Shor's algorithm can be executed at cryptographically relevant scales with as few as 10,000 reconfigurable atomic qubits. Increasing the number of physical qubits improves time efficiency by enabling greater parallelism; under plausible assumptions, the runtime for discrete logarithms on the P-256 elliptic curve could be just a few days for a system with 26,000 physical qubits, while the runtime for factoring RSA-2048 integers is one to two orders of magnitude longer. Recent neutral-atom experiments have demonstrated universal fault-tolerant operations below the error-correction threshold, computation on arrays of hundreds of qubits, and trapping arrays with more than 6,000 highly coherent qubits. Although substantial engineering challenges remain, our theoretical analysis indicates that an appropriately designed neutral-atom architecture could support quantum computation at cryptographically relevant scales. More broadly, these results highlight the capability of neutral atoms for fault-tolerant quantum computing with wide-ranging scientific and technological applications.

@sophieschmieg When people question the aggressive quantum readiness timelines, given that 100-qubit computers are all we have today, I have to explain that it's not just a matter of building a computer with a million qubits: researchers are still publishing optimizations that may cut that requirement by a factor of 10, or 100, or more. And we simply don't know if or when they'll figure out something better.
@targetdrone @sophieschmieg if we haven't learned from Y2K that preparing for shit quietly in the background pays off, we haven't learned anything.

@odr_k4tana

A lot of people think Y2K was a hoax because there was no huge apocalyptic disaster.

For some reason they find it difficult to believe that the huge apocalyptic disaster would have happened if not for the large, costly effort to fix the bugs *before* the big day.
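For anyone who never saw the bug itself: Y2K came from storing years as two digits, so arithmetic across the 1999→2000 boundary silently went wrong. A minimal sketch (function names and the pivot value are illustrative; windowing pivots like this were one common remediation, alongside full four-digit expansion):

```python
def age_in_2000_buggy(birth_yy):
    # Pre-Y2K code often subtracted two-digit years directly.
    current_yy = 0  # "00", i.e. the year 2000
    return current_yy - birth_yy

def age_in_2000_fixed(birth_yy, pivot=30):
    # Windowing fix: two-digit years below the pivot map to 20xx,
    # the rest to 19xx, so subtraction uses full four-digit years.
    birth_year = 2000 + birth_yy if birth_yy < pivot else 1900 + birth_yy
    return 2000 - birth_year

print(age_in_2000_buggy(70))  # -70: someone born in "70" gets a negative age
print(age_in_2000_fixed(70))  # 30
```

Multiply that one-line bug across decades of COBOL billing, payroll, and interest calculations and the scale of the remediation effort becomes clear.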

@targetdrone @sophieschmieg

@argv_minus_one In fairness, for people whose memories only reach back into the 21st century, I can understand how the idea of society coming together at scale and spending resources to tackle a foreseeable problem before it becomes a crisis might seem farfetched.
@odr_k4tana @targetdrone @sophieschmieg

@internic

Society didn't come together at scale. Society, for the most part, was panicked that the end of the world was nigh.

Business leaders are the ones who came together, presumably because they didn't want their businesses to abruptly screech to a halt on 2000-01-01, and hired an army of programmers to fix the bugs.

@odr_k4tana @targetdrone @sophieschmieg

@internic

Perhaps it's easier for business leaders to sigh and loosen the purse strings when the disaster (1) is absolutely certain to happen, and (2) will happen at an exact predetermined time.

There's no rationalizing inaction with “it'll be the next CEO's problem” when you know for sure exactly when it will happen and therefore exactly whose problem it will be.

@odr_k4tana @targetdrone @sophieschmieg