IBM Quantum breaks the 100‑qubit processor barrier
Today, IBM Quantum unveiled Eagle, a 127-qubit quantum processor. Eagle is leading quantum computers into a new era — we’ve launched a quantum processor that has pushed us beyond the 100-qubit barrier. We anticipate that, with Eagle, our users will be able to explore uncharted computational territory — and experience a key milestone on the path towards practical quantum computation.
We view Eagle as a step in a technological revolution in the history of computation. As quantum processors scale up, each additional qubit doubles the space complexity (the amount of memory) a classical computer needs to reliably simulate quantum circuits. We hope to see quantum computers bring real-world benefits across fields as this increase in space complexity moves us into a realm beyond the abilities of classical computers. While this revolution plays out, we hope to continue sharing our best quantum hardware with the community early and often. This approach allows IBM and our users to work together to understand how best to explore and develop on these systems to achieve quantum advantage as soon as possible.
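As a rough illustration of that scaling (a minimal sketch, not an IBM tool), a brute-force classical statevector simulator has to store 2^n complex amplitudes for n qubits, so its memory requirement doubles with every qubit added:

```python
# Memory needed to hold a full statevector of n qubits on a classical
# machine: 2**n complex amplitudes at 16 bytes each (complex128), so the
# requirement doubles with every additional qubit.

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (27, 65, 127):
    gib = statevector_bytes(n) / 2**30
    print(f"{n:>3} qubits -> {gib:.3e} GiB")
```

At 127 qubits the naive requirement is on the order of 10^30 GiB, which is the sense in which circuits at this scale move beyond brute-force classical simulation.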
Constructing a processor that breaks the hundred-qubit barrier wasn’t something we could do overnight. Scientists for decades have theorized that a computer based on the same mathematics followed by subatomic particles — quantum mechanics — could outperform classical computers at simulating nature. However, constructing one of these devices is an enormous challenge. Qubits can decohere — or forget their quantum information — with even the slightest nudge from the outside world. Producing Eagle on our short timeline was possible in part thanks to IBM’s legacy of pioneering new science and investing in core hardware technology, including processes for reliable semiconductor manufacturing and packaging and bringing nascent products to market.
Eagle’s qubit count represents an important milestone on our IBM Quantum Roadmap. Eagle demonstrates how our team is solving challenges across hardware and software to eventually realize a quantum computer capable of solving practical problems in fields from renewable energy to finance and more.

Quantum computation at scale
IBM Quantum’s Eagle processor contains nearly twice the qubits of our 65-qubit Hummingbird processor, but building something bigger takes more than simply adding qubits. We had to combine and improve upon techniques developed in previous generations of IBM Quantum processors to arrive at a processor architecture, including advanced 3D packaging techniques, that we’re confident can form the backbone of processors up to and including our planned 1,000+ qubit Condor processor.
Eagle is based on the heavy-hexagonal qubit layout debuted with our Falcon processor (the fourth iteration of this topology for IBM Quantum systems), where qubits connect with either two or three neighbors as if sitting upon the edges and corners of tessellated hexagons. This particular connectivity decreases the potential for errors caused by interactions between neighboring qubits, providing significant boosts in yielding functional processors.
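For readers who want to poke at the layout themselves, here is a small sketch. It assumes a recent Qiskit install in which the transpiler's CouplingMap class provides a from_heavy_hex constructor (check your version's documentation), and simply confirms that every qubit in such a map has at most three neighbors:

```python
# Sketch: build a heavy-hex coupling map and check qubit connectivity.
# Assumes Qiskit is installed and CouplingMap.from_heavy_hex exists in
# your version; the distance argument must be an odd integer.
from collections import Counter

from qiskit.transpiler import CouplingMap

cmap = CouplingMap.from_heavy_hex(5)

# Treat the coupling map as an undirected graph and count neighbors.
edges = {frozenset(edge) for edge in cmap.get_edges()}
degree = Counter()
for edge in edges:
    a, b = tuple(edge)
    degree[a] += 1
    degree[b] += 1

print("qubits in the map:", cmap.size())
print("max neighbors per qubit:", max(degree.values()))  # heavy hex: at most 3
```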
Eagle also incorporates readout multiplexing as featured in our Hummingbird R2. Previous processors required a set of control and readout electronics for each qubit — this is manageable for a few dozen qubits, but would be far too bulky for 100+, let alone 1,000+ qubit processors. Readout multiplexing allows us to drastically reduce the amount of electronics and wiring required inside of the dilution refrigerator.
Perhaps most importantly, Eagle incorporates past IBM expertise in classical processor fabrication to provide scalable access wiring to all qubits.
What do we mean? Quantum processors require a tangle of wiring that we must route outward to their edges. However, 3D integration allows us to place particular microwave circuit components and wiring on multiple physical levels. While packaging qubits remains one of the largest challenges for future quantum computers, multi-level wiring and other components provide the techniques that make possible the path toward Condor, with minimal impact to individual qubits’ performance.
There’s work yet to be done. The scale of a quantum chip is just one of three metrics we use to measure the performance of a quantum processor; we must continue to push the quality and speed of our processors by benchmarking their Quantum Volume and Circuit Layer Operations Per Second (CLOPS, a metric correlated with how fast a quantum processor can execute circuits), respectively.

A modular paradigm: IBM Quantum System Two
As we continue scaling our chips, we expect them to mature beyond the infrastructure of IBM Quantum System One. Therefore, we’re excited to unveil a concept for the future of quantum computing systems: IBM Quantum System Two.
Central to IBM Quantum System Two will be modularity. With this system, we’re giving flexibility to our hardware to continue to increase the scale of our chips. The team is taking a holistic systems approach to understand the necessary resources to support not only our upcoming Osprey and Condor processors, but also quantum processors into the future, as we continue to progress along our hardware roadmap.
System Two introduces a new generation of scalable qubit control electronics together with higher-density cryogenic components and cabling. Furthermore, we are working jointly with Bluefors to re-imagine the cryogenic platform. Bluefors’ new cryogenic platform, with its novel structural design, optimizes space inside the fridge to accommodate the increased support hardware required by larger processors, while ensuring that engineers can easily access and service the hardware inside the fridge.

IBM Quantum System Two: Design sneak preview
This platform brings the possibility of providing a larger shared cryogenic workspace, opening the door to potential linking of quantum processors through novel interconnects. We think that System Two represents a glimpse into the future of what quantum computing looks like — a true quantum data center.
Breaking the 100-qubit barrier is an incredible feat from the IBM Quantum team, and we’re looking forward to sharing Eagle and our other advances with the quantum computing community. There’s more to come as we progress along our IBM Quantum roadmap, from increases in the speed of our processors to pursuing quantum advantage perhaps even quicker than expected with the help of high-performance computing resources.
We hope you’ll join us as we continue our journey, with the goal of scaling quantum computers into paradigm-shifting systems capable of solving some of the most pressing challenges the world faces today. Even grander things await.
--IBM, 16/11/2021
This is a glorious feat of science but still depresso because we all know it was built from the blood of slaves and that the bloodsucking reptiles in charge will only use it to satisfy the eldritch whims of the market.
So is encryption broke yet or what.
That's gotta be the first thing this will be used for by the government
No, they still have a ways to go until it could feasibly be used for breaking current encryption algorithms; I think I've heard a figure of a few hundred thousand qubits or so.
No, it is significantly less. I think it is on the order of 100 for common encryption schemes, but I haven't gone back to the calculations since my last quantum mechanics class, and ~30 is the crossover for several other useful things like simulating other quantum systems. The issue is that these need to be logical qubits, and the current methods of error correction and dealing with decoherence involve encoding the quantum information in many physical qubits per logical qubit. This is a press release very light on details, so I am guessing that this is 127 physical qubits wired up in such a way that multiple error-correcting codes and schemes can be applied to the system, so the number of logical qubits would be in the 10s if it were used for encryption.
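To put rough numbers on that physical-to-logical overhead, here's a back-of-the-envelope sketch; the 2*d^2 figure is a common approximation for surface-code-style schemes, and the distances are purely illustrative assumptions, not anything IBM has published about Eagle:

```python
# Back-of-the-envelope: physical qubits per logical qubit under a
# surface-code-style scheme, where a distance-d logical qubit takes
# roughly 2 * d**2 physical qubits (d**2 data plus about d**2 - 1
# measurement qubits). Distances here are illustrative assumptions.

def physical_per_logical(code_distance: int) -> int:
    return 2 * code_distance ** 2

for d in (3, 11, 25):
    print(f"distance {d:>2}: ~{physical_per_logical(d)} physical qubits per logical qubit")

# Dividing a 127-qubit device by overheads like these is why the usable
# logical-qubit count ends up so much smaller than the physical-qubit count.
```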
The other thing that is important to note is that there are newer encryption methods, not based on factoring large numbers, that are supposedly equally hard or harder to break using quantum computers. But I don't remember too many details on that, and that might be where the hundreds-of-thousands figure comes from.
probably for basic stuff with weaker encryption methods, like shitty phone encryption. if you go ham on encrypting something its probably still fine right now
The fools are unknowingly building the tools for our own liberation, Cybernetics gang rise up :red-fist: :cat-com: :cyber-lenin:
Honestly for a planned economy you don't really need anything more than what we already have; classical computers are perfectly capable of running all the necessary economic calculations, despite the mental gymnastics of libertarian economists.
But having a linear-time solution for the traveling salesperson problem would be huge
It's hardly necessary, current algorithms for calculating routes for delivery and whatnot work just fine because actual real life routes don't involve weird theoretical edge cases that would break our current best-effort heuristic algorithms.
Consider this: smart people with good algorithms doing central planning vs. mediocre programmers able to use bad polynomial algos for the same things
Programming for quantum computers is anything but simple, though; it's not like you could write a brute-force algorithm and have the quantum computer magically solve it in linear time.
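Right, and even the generic quadratic speedup you get from Grover-style unstructured search doesn't change the picture much for something like traveling salesperson. Here's a quick sketch of the raw counts (just counting tours and square-rooting, nothing device-specific):

```python
# Count the candidate tours for an n-city traveling salesperson instance,
# then the ~sqrt(N) oracle queries a Grover-style unstructured search would
# need. A quadratic speedup over brute force is still nowhere near linear.
import math

def tours(n_cities: int) -> int:
    # Distinct closed tours, fixing the starting city and direction.
    return math.factorial(n_cities - 1) // 2

for n in (10, 20, 30):
    N = tours(n)
    print(f"{n} cities: {N:.3e} tours, ~{math.isqrt(N):.3e} Grover queries")
```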
Notice how they never actually explain what they think a quantum computer will be better at than a regular one? That's because beyond being able to break an RSA encryption and run quantum mechanics experiments, there's not really much we know how to do with this that would be better than a regular computer. I think this is just an excuse to shower tons of money on military contractors.
That’s because beyond being able to break an RSA encryption
MAKE NFTS FUNGIBLE AGAIN
its mostly numbers stuff that its good at vs a typical computer. just does massive computations way faster. its likely that if you require this sort of thing you might go buy a qpu instead of a gpu or something
It literally doesn't, to the best of my knowledge. That would require an algorithm that takes advantage of quantum phenomena to more efficiently perform operations, and I don't think there is one.
I recall reading some white paper that there's some optimization algos that are useful in ML that could be done much faster with quantum computing.
As far as I know, that is mostly speculative, but most of the machine learning stuff I know of is based on a D-Wave-type annealing system, which I think is not directly taking advantage of quantum phenomena per se, but making the analogy between optimization problems and annealing more explicit. Though ML does involve a lot of coupled/interacting variables, so there might be quantum algorithms I am not aware of.
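For anyone curious what the annealing analogy actually looks like, here's a tiny classical simulated-annealing sketch; the toy cost function and cooling schedule are made up for illustration, and D-Wave-style hardware is roughly the quantum-tunneling version of this same idea:

```python
# Classical simulated annealing on a toy 1-D cost function: always accept
# downhill moves, and accept uphill moves with a Boltzmann probability that
# shrinks as the "temperature" is lowered. Quantum annealers pursue the same
# kind of optimization, with tunneling instead of thermal hops.
import math
import random

random.seed(0)

def cost(x: float) -> float:
    # Bumpy function with several local minima; global minimum near x ~ -0.3.
    return x * x + 3.0 * math.sin(5.0 * x) + 3.0

x = 4.0
steps = 5000
for step in range(steps):
    T = max(0.01, 2.0 * (1.0 - step / steps))   # cooling schedule
    candidate = x + random.uniform(-0.5, 0.5)   # propose a local move
    delta = cost(candidate) - cost(x)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = candidate

print(f"ended at x = {x:.3f} with cost = {cost(x):.3f}")
```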
Now that seems plausible. There are just way too many people who don't understand how limited even an infinitely good QC would be relative to a normal computer right now, given how few existing quantum algorithms actually improve on classical algorithms.
Basically any numerical algorithm that's cyclical and parallelisable is better with quantum computation. Like, fast Fourier transform (FFT) computations, mainly used for large-scale multiplications, can be sped up a huge amount with quantum processing. FFTs are one of, if not the, most used algorithms in the world.
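For reference, here's what the classical version looks like: a minimal radix-2 Cooley-Tukey FFT. The quantum Fourier transform applies the same transform to a state's amplitudes using O(n^2) gates on n qubits, though reading the result back out is the usual catch:

```python
# Minimal radix-2 Cooley-Tukey FFT (input length must be a power of two),
# shown purely as the classical baseline being discussed above.
import cmath

def fft(x):
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

print(fft([1, 1, 1, 1, 0, 0, 0, 0]))  # 8-point example
```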
I work in chemistry, and quantum computers are a genuine godsend for us. In 30 years when they are actually in a useful state. Right now pretty much all quantum calculations are done using estimations, and really those estimations are quite shit a lot of the time. Having useful quantum computers will open huge possibilities in large quantum simulations that just aren't possible right now. Quantum computers really would open the door to searching for unique electronic structures in molecules and by proxy will allow for superior medicines that target extremely specific body systems.
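To give a feel for why those classical approximations exist at all, here's a toy sketch (a transverse-field Ising spin chain rather than a real molecule, with made-up coupling values): exact simulation means diagonalizing a 2^n by 2^n matrix, so each extra spin doubles both dimensions.

```python
# Toy example of exact quantum simulation: build a transverse-field Ising
# Hamiltonian for a short spin chain via Kronecker products and diagonalize
# it. The matrix is 2**n x 2**n, which is why exact treatment of realistic
# molecules is out of reach and approximations are used instead.
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def site_operator(op, site, n):
    """Embed a single-spin operator at position `site` in an n-spin chain."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, op if i == site else I2)
    return out

def ising_hamiltonian(n, J=1.0, h=0.5):
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):                       # nearest-neighbour ZZ coupling
        H -= J * site_operator(Z, i, n) @ site_operator(Z, i + 1, n)
    for i in range(n):                           # transverse field
        H -= h * site_operator(X, i, n)
    return H

for n in (4, 8, 10):
    H = ising_hamiltonian(n)
    ground = np.linalg.eigvalsh(H)[0]
    print(f"n={n:>2}: matrix {H.shape}, ground-state energy {ground:.4f}")
```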
Great point, I uh, tend to forget about chemists sometimes. Same is true for quantum mechanics experiments of course. My worry is simply that the military will lock down any good application of quantum computers. They're never advertised for these sorts of things they'd actually be useful for.
The (dull) ray of hope if the military locks down quantum computer tech is that it's not too difficult for an academic to get time on these kinds of machines with only the vaguest of justifications and connection to military goals. Just say something in the proposal about "laying the groundwork for future warfighter combat capacity" and "possible innovations in advanced weapon systems". As much as the military sucks, they are good at putting money into stuff that will only possibly help in the long run. There is little chance, imo, that internal military scientists would be able to fill enough time on a single quantum computer that the military wouldn't open it up to academia.
Compromising our entire network security infrastructure for basically no gain besides making people money.
Certain non-convex optimization problems, which would impact every industry and area of science. Most industries should be optimizing more, but current solvers hit limits quickly or people are too lazy to use them.
I wonder why we were taught in school that Luddites were just lunatics that wanted to smash machines because they thought they were magic.
my jewish ass, knowing what they did last time with their new technology
:side-eye-1: :side-eye-2:
I'm completely blanking- can anyone explain it to me like a boomer?
(Is it just computers can operate anything really fast now?)
Quantum computers aren't general purpose computers, but rather they're really good at a handful of historically very difficult computing problems. You'll almost certainly never end up using a quantum computer yourself, but they can be used for things like logistics, encryption/decryption, etc that are more useful for nation states or larger entities.
is it just computers can operate anything really fast now?
Technically yes, but quantum mechanics prevent you from gaining any useful information from those operations any faster than a classical computer for most operations.
In practice it’s only faster in certain scenarios where a quantum algorithm can be used, ex: Shor’s algorithm for factoring the product of 2 prime numbers.
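To make that division of labor concrete, here's a small sketch of the classical scaffolding around Shor's algorithm. The only step a quantum computer is actually needed for is finding the period r; this demo finds it by brute force instead, which is fine for tiny numbers like 15 or 21 and hopeless at real key sizes:

```python
# Sketch of the classical scaffolding around Shor's algorithm. The quantum
# computer's only job is finding the period r of f(x) = a**x mod N; here we
# find r by brute force, then recover factors the way Shor's classical
# post-processing does.
import math
import random

def find_period(a: int, N: int) -> int:
    # The step a quantum computer would do efficiently via phase estimation.
    x, val = 1, a % N
    while val != 1:
        val = (val * a) % N
        x += 1
    return x

def shor_classical_demo(N: int) -> tuple:
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g, N // g            # lucky guess already shares a factor
        r = find_period(a, N)
        if r % 2 == 1:
            continue                    # need an even period
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                    # trivial square root, try another a
        p = math.gcd(y - 1, N)
        q = math.gcd(y + 1, N)
        if 1 < p < N:
            return p, N // p
        if 1 < q < N:
            return q, N // q

print(shor_classical_demo(15))   # e.g. (3, 5)
print(shor_classical_demo(21))   # e.g. (3, 7)
```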
I'm not that well read on them. But basically it's only good at cracking passwords or whatever. Stuff like that.
The advantage is when you have to do a brute force calculation to guess a long number, the quantum bits can somehow check a ton of numbers simultaneously, because, like, quantum mechanics.
They are mostly just useful for very specific edge cases where normal computing just absolutely shits the bed. Quantum computing problems are mostly things that are normally handled by supercomputers today, i.e. weather, chemistry, materials science.