Compared with the Eagle, the Heron processor delivers a three- to five-fold improvement in device performance, and its error rate has reached a record low, two-thirds lower than that of previous quantum processors. The global race to build quantum supercomputers is in full swing, and IBM has played its trump card: on Monday it launched the company's most powerful quantum computing chip to date, along with the quantum computer IBM Quantum System Two, and laid out a blueprint for delivering a quantum supercomputer by 2033.
Quantum computing chip, error rate hits record low
On December 4, local time, IBM unveiled the quantum computing chip "IBM Quantum Heron" (Heron) at the company's Quantum Summit.
The "Heron" processor has 133 fixed-frequency qubits, exceeding the 127 qubits of the "Eagle" processor.
IBM said that compared with the Eagle, the Heron delivers a three- to five-fold improvement in device performance and a record-low error rate, two-thirds lower than that of its previous processors.
Next year, more Heron processors will join IBM's industry-leading fleet of utility-scale systems.
New modular system unveiled, bringing quantum supercomputers closer to reality
In addition, IBM launched IBM Quantum System Two, the company's first modular quantum computer. IBM demonstrated to the industry a new modular architecture that connects processors inside a machine and then links the machines together; combined with new error-correcting codes, the approach is intended to scale toward a practical quantum supercomputer.
"We are in an era where quantum computers are being used as tools to explore new frontiers in science," said Dario Gil, senior vice president and director of research at IBM.
"As we continue to advance quantum systems, scale and deliver value through modular architecture, we will further improve the quality of our utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of quantum technology for more complex problems."
The key obstacle to quantum computing – high error rates
Compared with classical computers, quantum computers exploit entanglement and superposition to achieve far more powerful parallel computation, making certain calculations dramatically faster.
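The "parallelism" described above comes from superposition: a register of n qubits carries amplitudes for all 2**n basis states at once. A minimal pure-Python sketch (no quantum library, purely illustrative):

```python
# Statevector illustration of superposition: applying a Hadamard gate
# to each of n qubits puts the register in an equal superposition of
# all 2**n basis states at once -- the source of quantum parallelism.

import math

def hadamard_all(n):
    """Statevector after H on every qubit of an n-qubit |0...0> register."""
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = hadamard_all(3)
print(len(state))                            # 8 basis states tracked at once
print(round(sum(a * a for a in state), 6))   # squared amplitudes sum to 1.0
```

A classical simulation like this needs memory exponential in n, which is exactly why quantum hardware is interesting: the physical qubits carry those amplitudes natively.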
However, these quantum states are also notoriously fickle and prone to error. To solve this problem, physicists coax multiple physical qubits (each encoded, for example, in a superconducting circuit or in a single trapped ion) into collectively encoding a single qubit of information, a so-called "logical qubit."
Researchers generally say that state-of-the-art error correction techniques require more than 1,000 physical qubits per "logical qubit," and that a machine that can perform useful calculations would need millions of physical qubits.
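The redundancy idea behind logical qubits can be sketched with a classical analogy: a three-bit repetition code, where majority voting recovers the encoded bit after a single flip. Real quantum codes are far more elaborate (they must also correct phase errors without directly measuring the data), so this is only an illustration of the principle:

```python
# A classical 3-bit repetition code as a simplified analogy for how
# several physical (qu)bits can collectively encode one "logical" bit.

def encode(bit):
    """Encode one logical bit into three physical bits."""
    return [bit, bit, bit]

def correct(bits):
    """Recover the logical bit by majority vote, tolerating one flip."""
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)      # [1, 1, 1]
codeword[0] ^= 1          # a single bit-flip error -> [0, 1, 1]
print(correct(codeword))  # majority vote recovers 1
```

The thousand-to-one overhead researchers cite comes from stacking many layers of this kind of redundancy while also protecting the correction machinery itself from errors.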
But in recent months, physicists have become increasingly interested in an alternative error correction scheme called quantum low-density parity check (qLDPC).
According to a preprint from IBM researchers, qLDPC could cut this number by a factor of 10 or more. The company said it will now focus on building a chip designed to house a set of qLDPC-corrected qubits among roughly 400 physical qubits, and then on connecting such chips together.
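"Low-density parity check" refers to a classical coding idea: each check constraint involves only a few bits, and the pattern of failed checks (the syndrome) pinpoints an error. The toy below uses the classical [7,4] Hamming code rather than an actual qLDPC quantum code, purely to show how a sparse parity-check matrix localizes a single flipped bit:

```python
# Toy illustration of parity checks, the classical idea behind qLDPC:
# each check touches only a few bits ("low density"), and the failed
# checks (the syndrome) identify the error. This is the classical
# [7,4] Hamming code, not a quantum code.

H = [  # parity-check matrix: each row is one check
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(word):
    """Evaluate every parity check on a received 7-bit word."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

word = [0, 1, 1, 0, 0, 1, 1]          # valid codeword: syndrome all zero
word[4] ^= 1                          # flip the 5th bit
s = syndrome(word)                    # failed checks encode the position
pos = s[0] * 1 + s[1] * 2 + s[2] * 4  # syndrome read as a binary number
print(pos)                            # 5 -> the flipped bit, 1-indexed
```

Quantum qLDPC codes apply the same sparse-check structure to qubits, which is what promises the large reduction in physical-qubit overhead.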
Mikhail Lukin, a physicist at Harvard University in Cambridge, Massachusetts, said IBM's preprint was an "excellent theoretical work."
"That being said, implementing this approach with superconducting qubits appears to be extremely challenging, and it may take years to even attempt proof-of-concept experiments on this platform," Lukin said.
The problem is that the qLDPC technique requires each qubit to be directly connected to at least six others, whereas on a traditional superconducting chip each qubit is connected to only two or three of its neighbors.
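The connectivity gap can be made concrete by counting neighbors in a nearest-neighbor square grid, which is already denser than IBM's heavy-hex layout (where degree is at most three): even interior qubits fall short of the six connections qLDPC needs. A small sketch:

```python
# Count each qubit's neighbors in a rows x cols nearest-neighbor grid
# coupling map and compare against the >= 6 connections qLDPC requires.

def grid_degrees(rows, cols):
    """Neighbor count for every qubit in a square-grid coupling map."""
    deg = {}
    for r in range(rows):
        for c in range(cols):
            deg[(r, c)] = sum(
                0 <= r + dr < rows and 0 <= c + dc < cols
                for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]
            )
    return deg

degs = grid_degrees(4, 4)
print(max(degs.values()))  # 4: even interior qubits fall short of 6
```

Hence IBM's plan to add an extra layer to the chip, supplying the long-range couplings that a planar layout alone cannot provide.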
But Oliver Dial, a condensed matter physicist and chief technology officer of IBM Quantum at the IBM Thomas J. Watson Research Center in Yorktown Heights, New York, said the company has a plan: It will add a layer of quantum chips to the design of its quantum computer to allow for the extra connections required by the qLDPC scheme.
Jay Gambetta, vice president of quantum at IBM, said the company has been taking a two-track approach to preparing the hardware, including developing the ability to manufacture high-quality qubits in large quantities on a sustained basis.
He said the Condor processor, which IBM also unveiled on Monday and which has 1,121 superconducting qubits, shows the company is in good shape on that front.
"The qubits are about 50% smaller," Gambetta told the media. "The yield is there – our yield is close to 100%."
The second problem IBM has been working on is limiting the errors that occur when operating on single or pairs of qubits.
Changing the state of a qubit creates subtle signals that can leak into neighboring qubits, a phenomenon known as crosstalk. Heron, the smaller of the new processors, represents four years of effort by the IBM R&D team to improve gate performance.
"It's a beautiful device," Gambetta said. "It's five times better than the previous device, has far fewer errors, and the crosstalk can hardly be measured."
When will quantum computing be commercialized?
Although this quantum computing research is a landmark, commercialization has so far remained elusive.
In addition, by the end of 2024, IBM plans to establish eight quantum computing centers across the United States, Canada, Japan, and Germany to make Quantum System Two widely available to researchers.
IBM researchers, including Gambetta, said recent advances have bolstered their confidence in the long-term potential of quantum computing, although they stopped short of predicting when it will enter the commercial mainstream.