IBM Quantum breaks the 100‑qubit processor barrier
Today, IBM Quantum unveiled Eagle, a 127-qubit quantum processor. Eagle is leading quantum computers into a new era — we’ve launched a quantum processor that has pushed us beyond the 100-qubit barrier. We anticipate that, with Eagle, our users will be able to explore uncharted computational territory — and experience a key milestone on the path towards practical quantum computation.
We view Eagle as a step in a technological revolution in the history of computation. As quantum processors scale up, each additional qubit doubles the space complexity, the amount of memory a classical computer needs to reliably simulate quantum circuits. We hope to see quantum computers bring real-world benefits across fields as this growth pushes simulation into a realm beyond the abilities of classical computers. While this revolution plays out, we hope to continue sharing our best quantum hardware with the community early and often. This approach allows IBM and our users to work together to understand how best to explore and develop on these systems, and to achieve quantum advantage as soon as possible.
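To put rough numbers on that doubling: a brute-force statevector simulation stores one complex amplitude per basis state, 2^n of them for n qubits. Here is a back-of-envelope sketch (assuming 16-byte double-precision complex amplitudes; specialized simulation methods can sometimes do better for particular circuits):

```python
def statevector_bytes(n_qubits: int) -> int:
    """Memory for a dense statevector: 2**n amplitudes at 16 bytes each."""
    return 16 * 2 ** n_qubits

# Qubit counts of IBM's Falcon (27), Hummingbird (65), and Eagle (127)
for n in (27, 65, 127):
    print(f"{n:3d} qubits: {statevector_bytes(n):.2e} bytes")
# 27 qubits fit in ~2 GB of laptop RAM; 65 qubits already need ~6e20 bytes;
# at 127 qubits the ~2.7e39 bytes required exceed any classical memory.
```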
Constructing a processor that breaks the hundred-qubit barrier wasn't something we could do overnight. For decades, scientists have theorized that a computer based on the same mathematics followed by subatomic particles, quantum mechanics, could outperform classical computers at simulating nature. However, constructing one of these devices is an enormous challenge. Qubits can decohere, or forget their quantum information, with even the slightest nudge from the outside world. Producing Eagle on our short timeline was possible in part thanks to IBM's legacy of pioneering new science and investing in core hardware technology, including reliable semiconductor manufacturing and packaging processes and the ability to bring nascent products to market.
Eagle's qubit count represents an important milestone on our IBM Quantum Roadmap. Eagle demonstrates how our team is solving challenges across hardware and software to eventually realize a quantum computer capable of solving practical problems in fields from renewable energy to finance and more.

Quantum computation at scale
IBM Quantum's Eagle processor contains nearly twice the qubits of our 65-qubit Hummingbird processor, but building something bigger takes more work than simply adding qubits. We had to combine and improve upon techniques developed in previous generations of IBM Quantum processors to develop a processor architecture, including advanced 3D packaging techniques, that we're confident can form the backbone of processors up to and including our planned 1,000+ qubit Condor processor.
Eagle is based on the heavy-hexagonal qubit layout debuted with our Falcon processor, the fourth iteration of the topology for IBM Quantum systems, in which qubits connect with either two or three neighbors as if sitting upon the edges and corners of tessellated hexagons. This particular connectivity decreases the potential for errors caused by interactions between neighboring qubits, providing significant boosts in the yield of functional processors.
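As a quick way to poke at this layout, here is a minimal sketch assuming Qiskit's CouplingMap.from_heavy_hex helper is available (the exact qubit count depends on the code distance passed in):

```python
from collections import Counter
from qiskit.transpiler import CouplingMap

# Heavy-hex lattice sized for a distance-3 code (the distance must be odd)
cmap = CouplingMap.from_heavy_hex(3, bidirectional=True)

# With bidirectional edges, counting how often a qubit appears as an edge
# source gives its number of neighbors
degree = Counter(src for src, _ in cmap.get_edges())
print(f"{cmap.size()} qubits, degree distribution: {Counter(degree.values())}")

# No qubit couples to more than three neighbors, which limits the
# frequency collisions and crosstalk that degrade two-qubit gates
assert max(degree.values()) <= 3
```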
Eagle also incorporates readout multiplexing as featured in our Hummingbird R2. Previous processors required a set of control and readout electronics for each qubit — this is manageable for a few dozen qubits, but would be far too bulky for 100+, let alone 1,000+ qubit processors. Readout multiplexing allows us to drastically reduce the amount of electronics and wiring required inside of the dilution refrigerator.
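The idea behind readout multiplexing can be illustrated with a toy signal-processing sketch in plain NumPy (purely illustrative frequencies and encoding, not IBM's actual readout chain): several readout tones at distinct frequencies share one output line, and digital homodyne detection separates them again.

```python
import numpy as np

fs = 1e9                       # 1 GS/s sampling rate (illustrative)
t = np.arange(0, 2e-6, 1 / fs)  # 2 microsecond readout window
freqs = [50e6, 75e6, 100e6, 125e6]  # hypothetical resonator tones, one feedline
states = [0, 1, 1, 0]               # pretend measurement outcomes

# Each qubit's state modulates its own tone; here we simply flip the sign
line = sum(((-1) ** s) * np.cos(2 * np.pi * f * t) for f, s in zip(freqs, states))

# Demultiplex: project the shared signal onto each qubit's reference tone
for f in freqs:
    iq = np.mean(line * np.exp(-2j * np.pi * f * t))  # digital homodyne at f
    print(f"{f / 1e6:.0f} MHz tone -> qubit state {0 if iq.real > 0 else 1}")
```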
Perhaps most importantly, Eagle incorporates past IBM expertise in classical processor fabrication to provide scalable access wiring to all qubits.
What do we mean? Quantum processors require a tangle of wiring that we must route outward to their edges. However, 3D integration allows us to place particular microwave circuit components and wiring on multiple physical levels. While packaging qubits remains one of the largest challenges for future quantum computers, multi-level wiring and other components clear the path toward Condor with minimal impact on individual qubits' performance.
There's work yet to be done. The scale of a quantum chip is just one of three metrics we use to measure the performance of a quantum processor; we must also continue to push the quality and speed of our processors, benchmarked respectively with Quantum Volume and with Circuit Layer Operations Per Second (CLOPS), a metric correlated with how fast a quantum processor can execute circuits.
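To make the Quantum Volume idea concrete, here is a minimal sketch of the heavy-output sampling test behind it, assuming Qiskit and its Aer simulator are installed (an illustration of the metric's flavor, not IBM's benchmarking pipeline):

```python
import numpy as np
from qiskit import transpile
from qiskit.circuit.library import QuantumVolume
from qiskit.quantum_info import Statevector
from qiskit_aer import AerSimulator

n = 4  # a QV circuit is "square": n qubits, n layers of random two-qubit blocks
qc = QuantumVolume(n, depth=n, seed=42)

# Ideal output distribution; "heavy" outputs lie above the median probability
probs = Statevector(qc).probabilities()
heavy = {i for i, p in enumerate(probs) if p > np.median(probs)}

# Sample the same circuit and compute the heavy-output probability (HOP)
meas = qc.copy()
meas.measure_all()
backend = AerSimulator()
counts = backend.run(transpile(meas, backend), shots=2000).result().get_counts()
hop = sum(c for bits, c in counts.items() if int(bits, 2) in heavy) / 2000
print(f"heavy-output probability: {hop:.3f}")  # passing QV 2**n needs HOP > 2/3
```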
A modular paradigm: IBM Quantum System Two

As we continue scaling our chips, we expect them to mature beyond the infrastructure of IBM Quantum System One. Therefore, we're excited to unveil a concept for the future of quantum computing systems: IBM Quantum System Two.
Central to IBM Quantum System Two will be modularity. With this system, we're giving our hardware the flexibility to continue increasing the scale of our chips. The team is taking a holistic systems approach to understand the resources necessary to support not only our upcoming Osprey and Condor processors, but also quantum processors of the future, as we continue to progress along our hardware roadmap.
System Two introduces a new generation of scalable qubit control electronics together with higher-density cryogenic components and cabling. Furthermore, we are working jointly with Bluefors to re-imagine the cryogenic platform. Bluefors' new cryogenic platform, with its novel structural design, optimizes space inside of the fridge to accommodate the increased support hardware required by larger processors, while ensuring that engineers can easily access and service the hardware inside the fridge.

IBM Quantum System Two: Design sneak preview
This platform brings the possibility of a larger shared cryogenic workspace, opening the door to linking multiple quantum processors through novel interconnects. We think that System Two represents a glimpse into the future of what quantum computing looks like: a true quantum data center.
Breaking the 100-qubit barrier is an incredible feat from the IBM Quantum team, and we're looking forward to sharing Eagle and our other advances with the quantum computing community. There's more to come as we progress along our IBM Quantum roadmap, from increases in the speed of our processors to pursuing quantum advantage, perhaps even sooner than expected, with the help of high-performance computing resources.
We hope you’ll join us as we continue our journey, with the goal of scaling quantum computers into paradigm-shifting systems capable of solving some of the most pressing challenges the world faces today. Even grander things await.
--IBM, 16/11/2021
No, it is significantly less. I think it is of order 100 for common encryption schemes, but I haven't gone back to the calculations since my last quantum mechanics class, and ~30 is the crossover for several other useful things, like simulating other quantum systems. The issue is that these need to be logical qubits, and the current methods of error correction and of dealing with decoherence encode the quantum information in many physical qubits per logical qubit. This is a press release very light on details, so I am guessing that this is 127 physical qubits wired up in such a way that multiple error-correcting codes and schemes can be applied to the system, so the number of logical qubits would be in the tens if it were used for encryption.
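As a rough illustration of that physical-to-logical overhead, here is a back-of-envelope sketch assuming a distance-d rotated surface code (one common scheme, roughly 2d^2 - 1 physical qubits per logical qubit; other codes, including whatever scheme the comment above has in mind, may differ):

```python
def surface_code_overhead(d: int) -> int:
    """Approximate physical-qubit cost of one logical qubit in a rotated
    surface code of distance d: d**2 data qubits + d**2 - 1 ancillas."""
    return 2 * d ** 2 - 1

for d in (3, 5, 7):
    print(f"distance {d}: ~{surface_code_overhead(d)} physical qubits per logical qubit")
# At distance 7 (~97 physical qubits per logical qubit), a 127-qubit chip
# would hold about one logical qubit under this particular scheme.
```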
The other thing that is important to note is that there are newer encryption methods, not based on factoring large numbers, that are supposedly as hard or harder to break using quantum computers. But I don't remember many details on that, and that might be where the hundreds-of-thousands figure comes from.