Quantum computers still can’t do much. Almost every time researchers have found something the high-tech machines should one day excel at, a classical algorithm comes along that can do it just as well on a regular computer. One notable exception? Taking apart numbers. In 1994, the mathematician Peter Shor devised an algorithm that would let quantum computers factor big numbers exponentially faster than classical machines. That speedup matters because a fast-factoring algorithm could render most data-encryption methods useless. For more than 30 years, researchers have been trying to boost and guard against the power of future quantum computers.
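To see why order-finding is the heart of the matter: Shor's insight was that factoring a number N reduces to finding the multiplicative order r of a random base a modulo N, and the quantum computer is needed only for that order-finding step. The rest is classical arithmetic, sketched below in Python (the function name is illustrative, and the brute-force order search stands in for the quantum subroutine, which finds r exponentially faster):

```python
from math import gcd

def shor_classical_postprocessing(N, a):
    """Factor N given a base a, using Shor's classical reduction.

    The order-finding loop below is the step a quantum computer
    would perform exponentially faster; everything else is the
    classical pre- and post-processing of Shor's algorithm.
    """
    # Brute-force the multiplicative order r of a mod N:
    # the smallest r with a^r = 1 (mod N).
    r = 1
    x = a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 != 0:
        return None  # odd order: retry with a different base a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry with a different a
    # gcd(y - 1, N) and gcd(y + 1, N) are nontrivial factors of N.
    return sorted((gcd(y - 1, N), gcd(y + 1, N)))

print(shor_classical_postprocessing(15, 7))  # → [3, 5]
```

For N = 15 and a = 7 the order is r = 4, and gcd(7² ± 1, 15) yields the factors 3 and 5; for cryptographic-size N, the classical order search above takes exponential time, which is exactly the gap the quantum subroutine closes.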

But Shor’s factoring algorithm also has limitations: The bigger the number you want to factor, the bigger and better the quantum computer you need. Cracking an encryption scheme would require a quantum computer running Shor’s algorithm on hundreds of thousands of efficient quantum bits, or qubits. Today’s machines are nowhere close.

But a paper posted to the scientific preprint site arxiv.org describes how to factor any number with considerably fewer qubits: just one. In the new work, researchers show how to factor an integer of any size with a single qubit and three components known as oscillators — readily available devices typically associated with other quantum technology, like optics systems.

To be clear, it’s not a practical advance: The process requires exponentially more energy than a million-qubit quantum computer. But it does illuminate new ways of solving these kinds of problems. “This departs from the typical way we think about computing — and not just quantum computing, but classical computing as well,” said Ulysse Chabaud, a computer scientist at the École Normale Supérieure in Paris who did not work on the new approach. “This seems crazy, if not impossible.”