Concepts from gauge theory could lead to a more efficient way to perform fault-tolerant quantum computation by reducing the number of qubits required for key operations, according to work by Dominic Williamson and Theodore Yoder at IBM Quantum in the US.
By adapting ideas from gauge theory, the researchers show how quantum information spread out across a machine can be checked for errors using only local measurements, significantly lowering the qubit overhead. Their approach applies to a wide class of quantum error-correction codes and could help accelerate the development of practical quantum computers.
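To give a flavour of what "local checks" means here, the toy sketch below simulates the classical repetition code, not the authors' gauge-theoretic construction: each check measures only the parity of two adjacent bits, yet the pattern of failed checks (the syndrome) pinpoints an error anywhere in the register.

```python
# Toy illustration (a hypothetical classical analogue, not the IBM scheme):
# errors on data spread across a device are detected by measuring only
# local "checks" -- small parity measurements on neighbouring bits.

def syndrome(bits):
    """Local checks: parity of each adjacent pair of bits."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

def decode_single_error(bits):
    """Correct at most one bit flip using only the local-check syndrome."""
    s = syndrome(bits)
    fired = [i for i, v in enumerate(s) if v]
    corrected = list(bits)
    if len(fired) == 2 and fired[1] == fired[0] + 1:
        corrected[fired[1]] ^= 1   # interior flip sits between two failed checks
    elif fired == [0]:
        corrected[0] ^= 1          # flip on the left boundary bit
    elif fired == [len(s) - 1]:
        corrected[-1] ^= 1         # flip on the right boundary bit
    return corrected

codeword = [0, 0, 0, 0, 0]
noisy = list(codeword)
noisy[2] ^= 1                      # single bit-flip error on bit 2
print(syndrome(noisy))             # -> [0, 1, 1, 0]: the two checks next to bit 2 fire
print(decode_single_error(noisy))  # -> [0, 0, 0, 0, 0]: error located and corrected
```

Quantum stabilizer codes generalize this idea: multi-qubit parity checks reveal where an error occurred without measuring (and so destroying) the encoded quantum state. The cited work concerns how such checks can be kept local while the encoded information remains spread out.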