A new theoretical study identifies fundamental tradeoffs that limit the amount of noise reduction in quantum information systems.
Quantum information technology could perform certain tasks faster and more securely than classical methods, but noise is a constant obstacle. Several so-called “purification” schemes exist that can lower the noise, and efforts are ongoing to improve them. But a new study shows that there are fundamental limits on the efficacy of these schemes.
Researchers from the University of Cambridge, UK, and from the Perimeter Institute for Theoretical Physics, Canada, have considered the purification of general kinds of quantum “resources,” which could be entanglement, coherence, or some other quantum property. They have shown that any such scheme is limited by the laws of quantum mechanics, which impose a tradeoff between the desired noise reduction and the efficiency of the purification.
To gain a sense of what purification is, imagine that Alice, the quantum-technology poster girl, is computing with entangled photons that are corrupted by environmental interactions. One way that she can purify her input is to generate many copies of the noisy states and then distill a smaller number of less noisy states from the larger sample. A typical distillation routine might involve measuring some of the copies to determine which of the unmeasured states carry less noise. The more copies she generates, the further the noise can be reduced. However, copies cost time and energy, so Alice has to decide how much purification she can afford. (APS Physics)
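To make that copies-versus-noise tradeoff concrete, here is a minimal Python sketch of one textbook purification scheme, the BBPSSW entanglement-distillation recurrence for Werner states (Bennett et al., 1996). It is only an illustrative special case, not the general framework analyzed in the new study, and the starting fidelity and function names are chosen here purely for the example.

```python
# Hypothetical illustration of the copies-versus-noise tradeoff in purification.
# Uses the textbook BBPSSW recurrence for Werner states (Bennett et al., 1996),
# not the general framework of the new study. Starting fidelity is arbitrary.

def bbpssw_round(F: float) -> tuple[float, float]:
    """One distillation round on two Werner-state pairs of fidelity F.

    Returns (new_fidelity, success_probability): two noisy pairs are consumed,
    and with probability p_success one less-noisy pair survives.
    """
    p_success = F**2 + 2 * F * (1 - F) / 3 + 5 * ((1 - F) / 3) ** 2
    F_new = (F**2 + ((1 - F) / 3) ** 2) / p_success
    return F_new, p_success

def distill(F0: float, rounds: int) -> None:
    """Print fidelity and the expected number of raw pairs consumed per output pair."""
    F, cost = F0, 1.0
    print(f"round 0: fidelity {F:.4f}, raw pairs per output {cost:.1f}")
    for r in range(1, rounds + 1):
        F, p = bbpssw_round(F)
        cost *= 2 / p  # each round needs two inputs and succeeds with probability p
        print(f"round {r}: fidelity {F:.4f}, raw pairs per output {cost:.1f}")

if __name__ == "__main__":
    distill(F0=0.75, rounds=5)
```

Running the sketch shows the fidelity creeping toward 1 while the expected number of raw pairs consumed per purified pair grows rapidly with each round, the kind of cost-versus-quality tradeoff that the new study bounds in full generality.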
The authors believe that these fundamental limits shed light on the price of practical quantum technologies while also serving as a benchmark for evaluating a given purification scheme.
This research has been published in Physical Review Letters.