April 22, 2026 ChainGPT

Streaming Quantum Method Cuts Qubit Needs to 60–300 — Crypto Keys at Risk Sooner

Quantum computers could soon play a practical role in training large-scale AI — and that prospect matters for crypto security, researchers say. A New Scientist report highlights a recent study from Caltech, Google Quantum AI, startup Oratomic, and MIT that addresses a major bottleneck: feeding massive datasets into quantum machines. AI training datasets often run to terabytes or petabytes, but to exploit quantum effects the data must be encoded as quantum states — a step that has traditionally demanded a large quantum memory footprint.

The new approach avoids loading an entire dataset into quantum memory up front. Instead, it prepares the required quantum states on the fly during processing, dramatically cutting the memory burden while still enabling quantum features such as superposition. According to the researchers, this streaming-style method could let quantum processors tackle large datasets using far less memory than previously thought necessary.

Performance estimates in the paper are striking. A fully error-corrected machine with roughly 300 logical qubits — logical qubits are fault-tolerant, error-corrected units used for reliable quantum computation — could outperform classical systems on some data-processing tasks relevant to AI. The team also suggests that even around 60 logical qubits might be enough to surpass classical approaches on certain AI workloads. Those lower-qubit thresholds do not mean such machines exist today, but they suggest practical quantum advantage for some data tasks may be closer than many expect.

“Machine learning is really utilized everywhere in science and technology, and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there’s massive datasets available,” said Hsin-Yuan Huang, CTO of Oratomic.
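The memory saving at the heart of the study can be illustrated with a purely classical analogy (this sketch is not the paper's quantum algorithm — the function names, chunk size, and workload are hypothetical): processing a dataset in small streamed chunks keeps the peak working set tiny, while producing the same result as loading everything up front.

```python
# Classical analogy only: streaming chunks vs. loading a whole dataset
# into memory before processing. Names and workload are illustrative.

def process_bulk(dataset):
    """Load everything first, then process; peak memory ~ dataset size."""
    loaded = list(dataset)                    # entire dataset resident at once
    return sum(x * x for x in loaded), len(loaded)

def process_streaming(dataset, chunk_size=4):
    """Hold only one small chunk at a time; peak memory ~ chunk_size."""
    total, peak, chunk = 0, 0, []
    for x in dataset:
        chunk.append(x)
        if len(chunk) == chunk_size:          # chunk full: process and discard
            peak = max(peak, len(chunk))
            total += sum(v * v for v in chunk)
            chunk = []
    if chunk:                                 # flush any final partial chunk
        peak = max(peak, len(chunk))
        total += sum(v * v for v in chunk)
    return total, peak

data = range(1000)
bulk_total, bulk_peak = process_bulk(data)
stream_total, stream_peak = process_streaming(data)
assert bulk_total == stream_total   # identical answer
assert stream_peak <= 4             # but a far smaller working set
```

The quantum version is far subtler — states must be prepared coherently on demand — but the accounting is analogous: the result is unchanged while the memory footprint drops from the full dataset to a small working buffer.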
Oratomic co-founder and CEO Dolev Bluvstein noted the danger of complacency about timelines: historically, projections for quantum milestones have been conservative — for example, past estimates for running Shor’s algorithm called for orders of magnitude more qubits than we might need today. The study also highlights a feedback loop: AI tools are helping researchers model and analyze complex quantum systems, speeding progress on quantum hardware and algorithms.

“The quantum machine is a very powerful device, but you do need to first feed it,” said Adrián Pérez-Salinas, professor of computational physics at ETH Zurich. “This study talks about feeding and how it’s enough to load [data] bit by bit, without overfeeding the beast.”

Why this matters for crypto and blockchain: advances that lower the qubit and memory requirements for quantum advantage accelerate the timeline for quantum threats to public-key cryptography. Stronger, more practical quantum hardware that can efficiently process large datasets will make robust quantum-resistant cryptographic plans more urgent for blockchain projects, custodians, and developers.

In short: a clever data-loading strategy could make quantum acceleration of AI tasks feasible with far fewer resources than once thought — and that shift tightens the race between quantum progress and cryptographic defense.