Despite great progress, a universally applicable, fault-tolerant quantum computer is still a long way off. A major hurdle is scalability: one cannot simply increase the number of elementary computing units, because the more quantum bits a machine contains, the harder it becomes to isolate each one from interfering influences while still controlling it precisely, and the problems grow with every additional qubit.
Even the slightest disturbance, triggered by noise, thermal radiation or electromagnetic fields, can destroy the states of the quantum bits and lead to calculation errors. To shield them from the environment, quantum bits are usually enclosed in a vacuum (atoms and ions) or cooled to extremely low temperatures with liquid helium (superconducting quantum bits), which requires considerable technical effort.
That is why the best prototypes currently have only a few dozen computing units (Google's "Sycamore" calculates with 53 working superconducting qubits).
A reasonably powerful system that could handle a wide variety of tasks faster than a classical computer would require at least several hundred qubits.
Scientists from the Max Planck Institute for Quantum Optics in Garching have a possible solution to this scaling dilemma. Instead of packing ever more quantum bits into a confined space, they interconnect separated qubits, each of which can be controlled and shielded almost perfectly. In this way, the researchers working with Severin Daiss and Stefan Langenfeld have used just two quantum bits to build an elementary quantum processor that can carry out simple logical operations. A special feature: the qubits are located in two separate laboratories and are connected via a 60-meter-long optical fiber.
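To see what "simple logical operations" on two qubits means, here is a toy simulation of a CNOT gate, a standard elementary two-qubit operation, acting on a state vector. This is only an illustrative sketch of the underlying logic, not the researchers' actual photonic protocol over the fiber link:

```python
import numpy as np

# A two-qubit register is a 4-component complex vector over the
# basis states |00>, |01>, |10>, |11>.
# The CNOT gate flips the target qubit whenever the control qubit is |1>.
CNOT = np.array([
    [1, 0, 0, 0],  # |00> -> |00>
    [0, 1, 0, 0],  # |01> -> |01>
    [0, 0, 0, 1],  # |10> -> |11>
    [0, 0, 1, 0],  # |11> -> |10>
], dtype=complex)

def apply_gate(gate, state):
    """Apply a unitary gate matrix to a state vector."""
    return gate @ state

# Put the control qubit in the superposition (|0> + |1>)/sqrt(2) and the
# target in |0>; the joint state is then (|00> + |10>)/sqrt(2).
state = np.array([1, 0, 1, 0], dtype=complex) / np.sqrt(2)

# After the CNOT the two qubits are entangled: (|00> + |11>)/sqrt(2).
entangled = apply_gate(CNOT, state)
```

The interesting point, mirrored in the experiment, is that a single two-qubit gate can turn independent qubits into an entangled pair, the basic resource that distributed quantum processors must share across their link.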