Quantum Computing
Quantum computing has moved from theoretical research to early commercial deployment in 2020-2025. Major players (IBM, Google, Quantinuum, IonQ, Rigetti, PsiQuantum, Atom Computing) are racing to scale qubit counts and drive down error rates. The cryptography implications are pressing: RSA-breaking quantum capability could arrive within 10-20 years, forcing a transition to post-quantum cryptography.
Key insights
Qubit count is the headline race, fidelity is the real metric
Headline qubit counts have grown rapidly: IBM's Condor at 1,121 qubits, Atom Computing at 1,180, and Google's Willow at 105 (paired with a major error-correction breakthrough). But error rates limit useful computation. Today's processors are 'noisy intermediate-scale quantum' (NISQ) devices, too noisy for general-purpose quantum applications. Logical qubits (error-corrected combinations of physical qubits) are the metric that matters; an estimated 1,000-10,000 physical qubits are needed per logical qubit.
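To make the logical-vs-physical gap concrete, a minimal back-of-envelope sketch using the headline processor sizes and the 1,000-10,000 overhead range quoted above (illustrative arithmetic, not vendor data):

```python
# Rough logical-qubit yield of today's flagship processors, assuming the
# commonly cited overhead of 1,000-10,000 physical qubits per logical qubit.
OVERHEAD_LOW, OVERHEAD_HIGH = 1_000, 10_000

processors = {
    "IBM Condor": 1_121,
    "Atom Computing": 1_180,
    "Google Willow": 105,
}

for name, physical in processors.items():
    best = physical // OVERHEAD_LOW    # optimistic overhead assumption
    worst = physical // OVERHEAD_HIGH  # pessimistic overhead assumption
    print(f"{name}: {physical:,} physical -> {worst}-{best} logical qubits")
```

Even the largest machines today yield at most one logical qubit under these assumptions, which is why they remain NISQ devices.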
Cryptography stakes are existential
Shor's algorithm, published in 1994, would break RSA encryption and elliptic-curve cryptography in polynomial time on a sufficiently large quantum computer. Breaking RSA-2048 would require roughly 4,000 logical qubits, likely 4-10 million physical qubits. Today's quantum computers are 1,000-10,000× too small. But the gap is narrowing. NIST released its first post-quantum cryptography standards in August 2024 (ML-KEM, ML-DSA, SLH-DSA). Migration to post-quantum cryptography is now underway across major systems.
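Running the same arithmetic forward: a sketch of the physical-qubit requirement for RSA-2048, using the ~4,000 logical qubits cited above. Note that the 4-10 million physical-qubit figure implies an assumed overhead of roughly 1,000-2,500 physical qubits per logical qubit, at the optimistic end of the 1,000-10,000 range:

```python
# Back-of-envelope physical-qubit requirement for breaking RSA-2048,
# using the ~4,000 logical qubits cited in the text. The overhead range
# here (1,000-2,500) is the one implied by the 4-10 million figure.
LOGICAL_QUBITS = 4_000

for overhead in (1_000, 2_500):
    physical = LOGICAL_QUBITS * overhead
    print(f"overhead {overhead:,}: {physical / 1e6:.0f}M physical qubits")
```

With ~1,000-qubit machines today, the implied gap is three to four orders of magnitude, matching the "1,000-10,000× too small" claim above.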
Commercial applications are emerging but narrow
Quantum advantage has been demonstrated on specific narrow problems (Google Sycamore, 2019; subsequent benchmarks contested). Promising commercial applications include optimization (logistics, finance), molecular simulation (drug discovery, materials science), and machine-learning components. None has yet produced a clear 'killer app' that beats classical performance at meaningful cost. The 'quantum winter' concern, that commercial value lags the hype, is being discussed but has not yet materialized.
Quantum processor qubit count — leading systems
Number of qubits in flagship processors
Key Finding: Qubit count has grown ~10× in 5 years. But fidelity is the real bottleneck.
Quantum computing investment 2015–2024
USD billions in public + private quantum tech investment
Key Finding: Investment surged ~2020-2024 with multiple billion-dollar startups (PsiQuantum, IonQ).
Methodology & caveats
Qubit types
Superconducting (IBM, Google): cryogenic circuits. Trapped ion (IonQ, Quantinuum): individual atoms suspended in electromagnetic fields. Photonic (PsiQuantum, Xanadu): light particles. Neutral atom (QuEra, Atom Computing): cold atoms in optical traps. Each technology has different scaling characteristics, error profiles, and connectivity. None is yet clearly dominant; the race is ongoing.
Error correction
Quantum error correction creates 'logical qubits' that are robust against individual physical-qubit errors. Surface codes are the leading approach and require ~1,000-10,000 physical qubits per logical qubit. Google's Willow (December 2024) showed the first below-threshold scaling: error rates dropping as code distance increases. This is the precondition for practically useful fault-tolerant quantum computing.
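The below-threshold behavior can be illustrated with the standard surface-code approximation p_L ≈ A·(p/p_th)^((d+1)/2), where d is the code distance. The threshold value and prefactor below are illustrative textbook-style assumptions, not Willow's measured numbers:

```python
# Illustrative surface-code scaling: the logical error rate falls
# exponentially with code distance d ONLY when the physical error rate p
# is below the threshold p_th. (Standard approximation, not measured data.)
P_TH = 0.01  # assumed threshold ~1%, a typical surface-code estimate

def logical_error_rate(p: float, d: int, A: float = 0.1) -> float:
    """Approximate logical error rate for a distance-d surface code."""
    return A * (p / P_TH) ** ((d + 1) // 2)

for d in (3, 5, 7):
    # A distance-d surface code uses roughly 2*d**2 physical qubits.
    below = logical_error_rate(0.001, d)  # p < p_th: error rate improves with d
    above = logical_error_rate(0.015, d)  # p > p_th: error rate worsens with d
    print(f"d={d} (~{2 * d * d} physical qubits): "
          f"below-threshold {below:.1e}, above-threshold {above:.1e}")
```

The crossover is the whole game: below threshold, each increase in code distance suppresses logical errors multiplicatively; above it, adding qubits makes things worse.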
Timeline uncertainty
Predictions for cryptographically relevant quantum computing range from 5 to 30 years out. NIST and the NSA have recommended migration to post-quantum cryptography by 2035, treating the mid-2030s as a plausible timeline. 'Harvest now, decrypt later' attacks already collect encrypted traffic for future decryption, making post-quantum migration urgent for high-value, long-lived secrets.