Scientists have achieved the lowest quantum computing error rate ever recorded, an important step toward solving one of the fundamental challenges facing practical, utility-scale quantum computers.
In a study published June 12 in the journal Physical Review Letters, scientists demonstrated a quantum error rate of 0.000015%.
This achievement represents nearly an order-of-magnitude improvement in both fidelity and speed over the previous record of roughly one error per million operations, set by the same team in 2014.
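As a back-of-the-envelope sanity check, the quoted figures can be converted into errors per operation. This sketch uses only the two rates stated above; nothing else is from the study:

```python
# Convert the reported error rate (0.000015%) into operations per error
# and compare it with the 2014 record of ~1 error per million operations.
new_rate = 0.000015 / 100          # percent -> fraction: 1.5e-7 errors/op
old_rate = 1 / 1_000_000           # previous record: 1e-6 errors/op

ops_per_error = 1 / new_rate       # how many operations per single error
improvement = old_rate / new_rate  # factor of improvement over 2014

print(f"{ops_per_error:,.0f} operations per error")
print(f"{improvement:.1f}x fewer errors than the previous record")
```

The ratio works out to roughly 6.7, which is why the article describes the result as "nearly" an order of magnitude rather than a full factor of ten.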
The prevalence of errors or “noise” in quantum operations can render the output of quantum computers useless.
This noise comes from a variety of sources, including imperfections in the control methods (essentially, problems with the computer's architecture and algorithms) and the laws of physics themselves. That is why a considerable amount of effort goes into quantum error correction.
Errors that arise from the laws of physics, such as decoherence (the natural collapse of quantum states) and leakage (qubit states leaking out of the computational subspace), can only be reduced so far within those laws. The team's progress, however, came from reducing the noise generated by the computer's architecture and control methods to nearly zero.
“By drastically reducing the chance of error, this work significantly reduces the infrastructure required for error correction, opening the way for future quantum computers to be smaller, faster and more efficient,” said Molly Smith, a graduate student at the University of Oxford and a co-author of the study. “Precise control of qubits is also useful for other quantum technologies, such as clocks and quantum sensors.”
Record-low quantum computing error rate
The quantum computer used in the team's experiments relied on a bespoke platform that eschews the more common architecture, which uses photons as qubits (the quantum equivalent of computer bits), in favor of qubits made of “trapped ions.”
The experiment was also conducted at room temperature, which the researchers said will simplify the setup required to integrate this technology into a working quantum computer.
Where most quantum systems deploy superconducting circuits that rely on “quantum dots,” or use lasers (so-called “optical tweezers”) to hold individual photons in place as qubits, this platform traps a series of calcium-43 ions and manipulates them with microwaves.
This approach places the ions in a hyperfine “atomic clock” state. The study showed that the technique allowed the researchers to perform more “quantum gates”, roughly, the number of quantum operations a computer can carry out, and to do so more accurately than photon-based methods allow.
Once the ions were placed in the hyperfine atomic clock state, the researchers calibrated them via an automated control procedure that periodically corrected for the amplitude and frequency drift of the microwave control signals.
In other words, the researchers developed an algorithm that detects and corrects the noise generated by the microwaves used to control the ions. With this noise removed, the team could use the system to perform quantum operations at or near the lowest error rate physically possible.
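The article does not describe the calibration procedure in detail, but the general idea of a feedback loop that periodically re-estimates slow drift and subtracts it can be sketched as a toy model. Everything below (the `measure_error` stand-in, the gain, the drift values) is hypothetical and for illustration only, not the authors' actual method:

```python
import random

def measure_error(true_drift, correction):
    """Hypothetical noisy measurement of the residual drift left
    after applying the current correction."""
    return (true_drift - correction) + random.gauss(0, 0.001)

def calibrate(true_drift, rounds=50, gain=0.5):
    """Feedback loop: repeatedly measure the residual drift and fold a
    fraction of it into the running correction term."""
    correction = 0.0
    for _ in range(rounds):
        residual = measure_error(true_drift, correction)
        correction += gain * residual  # nudge correction toward the drift
    return correction

random.seed(0)
# Arbitrary drift values in arbitrary units, purely for demonstration.
amp_correction = calibrate(true_drift=0.02)    # amplitude drift
freq_correction = calibrate(true_drift=-0.05)  # frequency drift
print(amp_correction, freq_correction)
```

After enough rounds the correction converges to the true drift up to the measurement noise, which is the essential behavior of any periodic drift-correction loop: the control signal is never noise-free, but its slowly varying errors are continuously cancelled.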
Using this method, it may now be possible to develop quantum computers that perform single-qubit gate operations (those performed on a single qubit, as opposed to gates involving multiple qubits) at scale with almost no errors.
This could make robust quantum computers easier to build and run more efficiently, and the research provides a breakdown of the new state-of-the-art single-qubit gate error across all known sources of error.
This means that engineers building quantum computers with trapped-ion architectures, and developers creating the algorithms that run on them, do not need to devote as many qubits to the sole purpose of error correction.
By reducing errors, the new method reduces the number of qubits required, and with it the cost and size of the quantum computer itself, the researchers said in a statement.
However, this is not a panacea for the industry: many quantum algorithms require multi-qubit gates working alongside single-qubit gates to perform computations beyond basic functions, and the error rate for two-qubit gate operations is still about 1 in 2,000.
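The gap between the two rates matters because gate errors compound over a circuit. A rough illustration, assuming independent errors and using made-up gate counts (the rates are the ones quoted in the article; the circuit is hypothetical):

```python
# How gate errors compound over a circuit, assuming independent errors.
single_qubit_error = 1.5e-7  # new record reported in the study
two_qubit_error = 1 / 2_000  # current typical two-qubit gate error

# Hypothetical circuit: gate counts chosen purely for illustration.
n_single, n_two = 10_000, 1_000

p_success = ((1 - single_qubit_error) ** n_single *
             (1 - two_qubit_error) ** n_two)
print(f"probability of an error-free run: {p_success:.3f}")
```

Even with ten times as many single-qubit gates, virtually all of the failure probability comes from the two-qubit gates, which is why the article notes that two-qubit error rates remain the bottleneck.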
This study represents an important step toward practical, utility-scale quantum computing, but it does not address all of the “noise” problems inherent in complex systems that use multi-qubit gates.