Distributed quantum information processing using multiplexed photons


Distributed quantum information processing is based on the transmission of quantum data over lossy channels between quantum processing nodes.

These nodes may be separated by a few microns or by planetary-scale distances, but transmission losses due to absorption and/or scattering in the channel are the major source of error for most distributed quantum information tasks.

Of course, Quantum Error Correction (QEC) and detection techniques can be used to mitigate such effects, but error-detection approaches suffer severe performance limitations due to the signaling constraints between nodes, so error-correction approaches are preferable, assuming sufficiently high-quality local operations are available.

Typically, performance comparisons between loss-mitigating codes assume one encoded qubit per photon. However, single photons can carry more than one qubit of information.
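As a rough illustration of this point (not taken from the paper), a photon carrying several independent degrees of freedom, such as polarization together with a set of time bins or spatial modes, spans a Hilbert space whose dimension is the product of the individual dimensions, so each added degree of freedom increases the number of qubits one photon can encode. A minimal sketch, with hypothetical degree-of-freedom dimensions chosen purely for illustration:

```python
import math

# Hypothetical degrees of freedom of a single photon (dimensions chosen for illustration):
# polarization (2 levels), 4 time bins, 4 spatial modes.
degrees_of_freedom = {"polarization": 2, "time_bin": 4, "spatial_mode": 4}

# The total Hilbert-space dimension is the product of the individual dimensions.
total_dim = math.prod(degrees_of_freedom.values())

# The number of qubits the photon can carry is log2 of that dimension.
qubits_per_photon = math.log2(total_dim)

print(f"Hilbert-space dimension: {total_dim}")    # 32
print(f"Qubits per photon: {qubits_per_photon}")  # 5.0
```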

The scientists, based in Japan, explored whether loss-mitigating QEC codes using quantum-multiplexed photons are viable and advantageous, especially since the loss of a single photon then means losing more than one qubit of information.

They showed that quantum multiplexing enables a significant reduction in resources, specifically the number of single-photon sources, while maintaining (or even lowering) the number of two-qubit gates required. Further, their multiplexing approach requires only conventional optical gates already necessary for the implementation of these codes.
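To get a feel for the kind of resource accounting this implies (an illustrative estimate, not the paper's analysis), if a loss-mitigating code uses n physical qubits and each photon is multiplexed to carry m of them, the number of single-photon sources drops from n to roughly n/m, rounded up. A short sketch, with a hypothetical 49-qubit encoding chosen only as an example:

```python
import math

def photon_sources_required(n_code_qubits: int, qubits_per_photon: int) -> int:
    """Rough count of single-photon sources needed when each photon
    carries `qubits_per_photon` multiplexed qubits (illustrative only)."""
    return math.ceil(n_code_qubits / qubits_per_photon)

# Hypothetical 49-qubit loss-mitigating encoding, with increasing multiplexing.
for m in (1, 2, 4):
    print(m, "qubits/photon ->", photon_sources_required(49, m), "photon sources")
# 1 qubits/photon -> 49 photon sources
# 2 qubits/photon -> 25 photon sources
# 4 qubits/photon -> 13 photon sources
```

The trade-off, as the article notes, is that losing one multiplexed photon now erases several qubits at once, which is exactly the regime the authors examined.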

The paper has been published in Physical Review Letters.