
Microsoft, Atom Computing Leap Ahead On the Quantum Frontier With Logical Qubits

An anonymous reader quotes a report from GeekWire: Microsoft and Atom Computing say they've reached a new milestone in their effort to build fault-tolerant quantum computers that can show an advantage over classical computers. Microsoft says it will start delivering the computers' quantum capabilities to customers by the end of 2025, with availability via the Azure cloud service as well as through on-premises hardware. "Together, we are co-designing and building what we believe will be the world's most powerful quantum machine," Jason Zander, executive vice president at Microsoft, said in a LinkedIn posting. Like other players in the field, Microsoft's Azure Quantum team and Atom Computing aim to capitalize on the properties of quantum systems -- where quantum bits, also known as qubits, can process multiple values simultaneously. That's in contrast to classical systems, which typically process ones and zeros to run algorithms. Microsoft has been working with Colorado-based Atom Computing on hardware that uses the nuclear spin properties of neutral ytterbium atoms to run quantum calculations. One of the big challenges is to create a system that can correct the errors that turn up during the calculations due to quantum noise. The solution typically involves knitting together "physical qubits" to produce an array of "logical qubits" that can correct themselves. In a paper posted to the arXiv preprint server, members of the research team say they were able to connect 256 noisy neutral-atom qubits using Microsoft's qubit-virtualization system in such a way as to produce a system with 24 logical qubits. "This represents the highest number of entangled logical qubits on record," study co-author Krysta Svore, vice president of advanced quantum development for Microsoft Azure Quantum, said today in a blog posting. "Entanglement of the qubits is evidenced by their error rates being significantly below the 50% threshold for entanglement." Twenty of the system's logical qubits were used to perform successful computations based on the Bernstein-Vazirani algorithm, which is used as a benchmark for quantum calculations. "The logical qubits were able to produce a more accurate solution than the corresponding computation based on physical qubits," Svore said. "The ability to compute while detecting and correcting errors is a critical component to scaling to achieve scientific quantum advantage."
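For readers who want to see what the benchmark actually does: Bernstein-Vazirani recovers a hidden bit string from a single query to an oracle. Below is a minimal sketch in Qiskit; the secret string, qubit count, and simulator backend are illustrative assumptions, not Microsoft's logical-qubit implementation.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator  # assumes the qiskit-aer package is installed

# Hidden bit string the oracle encodes; Bernstein-Vazirani recovers it in one query.
secret = "1011"
n = len(secret)

qc = QuantumCircuit(n + 1, n)
qc.x(n)                 # put the ancilla in |1>
qc.h(range(n + 1))      # input register into superposition, ancilla into |->
for i, bit in enumerate(reversed(secret)):
    if bit == "1":      # oracle: CNOT onto the ancilla wherever the secret bit is 1
        qc.cx(i, n)
qc.h(range(n))          # undo the Hadamards on the input register
qc.measure(range(n), range(n))

# On a noiseless simulator every shot returns the secret string; on noisy physical
# qubits some shots come back wrong, which is what logical qubits aim to suppress.
counts = AerSimulator().run(qc, shots=1024).result().get_counts()
print(counts)           # expect {'1011': 1024}, or nearly so
```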

Read more of this story at Slashdot.

'El Capitan' Ranked Most Powerful Supercomputer In the World

Lawrence Livermore National Laboratory's "El Capitan" supercomputer is now ranked as the world's most powerful, achieving a High-Performance Linpack (HPL) score of 1.742 exaflops on the latest TOP500 list. Engadget reports: El Capitan is only the third "exascale" computer, meaning it can perform more than a quintillion calculations in a second. The other two, called Frontier and Aurora, claim the second and third place slots on the TOP500 now. Unsurprisingly, all of these massive machines live within government research facilities: El Capitan is housed at Lawrence Livermore National Laboratory; Frontier is at Oak Ridge National Laboratory; Argonne National Laboratory claims Aurora. [HPE Cray] had a hand in all three systems. El Capitan has more than 11 million combined CPU and GPU cores based on AMD 4th-gen EPYC processors. These 24-core processors are rated at 1.8GHz each and are packaged into AMD Instinct MI300A APUs. It's also relatively efficient, as such systems go, squeezing out an estimated 58.89 gigaflops per watt. If you're wondering what El Capitan is built for, the answer is addressing nuclear stockpile safety, but it can also be used for nuclear counterterrorism.
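Those two headline figures are easy to relate with a quick back-of-the-envelope calculation: dividing the sustained HPL performance by the reported efficiency gives a rough estimate of the machine's power draw during the benchmark. The efficiency figure may come from a separate power-optimized run, so treat the result as approximate.

```python
# Rough power estimate implied by the published numbers (illustrative only).
hpl_flops = 1.742e18          # 1.742 exaflops sustained on HPL
gflops_per_watt = 58.89       # reported energy efficiency

watts = hpl_flops / (gflops_per_watt * 1e9)
print(f"~{watts / 1e6:.1f} MW during the benchmark run")  # roughly 29.6 MW
```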

Read more of this story at Slashdot.

With First Mechanical Qubit, Quantum Computing Goes Steampunk

An anonymous reader quotes a report from Science Magazine: Qubits, the strange devices at the heart of a quantum computer that can be set to 0, 1, or both at once, could hardly be more different from the mechanical clockwork used in the earliest computers. Today, most quantum computers rely on qubits made out of tiny circuits of superconducting metal, individual ions, photons, or other things. But now, physicists have made a working qubit from a tiny, moving machine, an advance that echoes back to the early 20th century when the first computers employed mechanical switches. "For many years, people were thinking it would be impossible to make a qubit from a mechanical system," says Adrian Bachtold, a condensed matter physicist at the Institute of Photonic Sciences who was not involved in the work, published today in Science. Stephan Dürr, a quantum physicist at the Max Planck Institute for Quantum Optics, says the result "puts a new system on the map," which could be used in other experiments -- and perhaps to probe the interface of quantum mechanics and gravity. [...] The new mechanical qubit is unlikely to run more mature competition off the field any time soon. Its fidelity -- a measure of how well experimenters can set the state they desire -- is just 60%, compared with greater than 99% for the best qubits. For that reason, "it's an advance in principle," Bachtold says. But Dürr notes that a mechanical qubit might serve as a supersensitive probe of forces, such as gravity, that don't affect other qubits. And the ETH Zurich researchers hope to take their demonstration a step further by using two mechanical qubits to perform simple logical operations. "That's what Igor is working on now," [says Yiwen Chu, a physicist at ETH Zurich]. If they succeed, the physical switches of the very first computers will have made a tiny comeback.

Read more of this story at Slashdot.

IBM Boosts the Amount of Computation You Can Get Done On Quantum Hardware

An anonymous reader quotes a report from Ars Technica: There's a general consensus that we won't be able to consistently perform sophisticated quantum calculations without the development of error-corrected quantum computing, which is unlikely to arrive until the end of the decade. It's still an open question, however, whether we could perform limited but useful calculations at an earlier point. IBM is one of the companies that's betting the answer is yes, and on Wednesday, it announced a series of developments aimed at making that possible. On their own, none of the changes being announced are revolutionary. But collectively, changes across the hardware and software stacks have produced much more efficient and less error-prone operations. The net result is a system that supports the most complicated calculations yet on IBM's hardware, leaving the company optimistic that its users will find some calculations where quantum hardware provides an advantage. [...] Wednesday's announcement was based on the introduction of the second version of its Heron processor, which has 133 qubits. That's still beyond the capability of simulations on classical computers, should it be able to operate with sufficiently low errors. IBM VP Jay Gambetta told Ars that Revision 2 of Heron focused on getting rid of what are called TLS (two-level system) errors. "If you see this sort of defect, which can be a dipole or just some electronic structure that is caught on the surface, that is what we believe is limiting the coherence of our devices," Gambetta said. This happens because the defects can resonate at a frequency that interacts with a nearby qubit, causing the qubit to drop out of the quantum state needed to participate in calculations (called a loss of coherence). By making small adjustments to the frequency that the qubits are operating at, it's possible to avoid these problems. This can be done when the Heron chip is being calibrated before it's opened for general use. Separately, the company has done a rewrite of the software that controls the system during operations. "After learning from the community, seeing how to run larger circuits, [we were able to] almost better define what it should be and rewrite the whole stack towards that," Gambetta said. The result is a dramatic speed-up. "Something that took 122 hours now is down to a couple of hours," he told Ars. Since people are paying for time on this hardware, that's good for customers now. However, it could also pay off in the longer run, as some errors can occur randomly, so less time spent on a calculation can mean fewer errors. Despite all those improvements, errors are still likely during any significant calculations. While it continues to work toward developing error-corrected qubits, IBM is focusing on what it calls error mitigation, which it first detailed last year. [...] The problem here is that using the function is computationally difficult, and the difficulty increases with the qubit count. So, while it's still easier to do error mitigation calculations than simulate the quantum computer's behavior on the same hardware, there's still the risk of it becoming computationally intractable. But IBM has also taken the time to optimize that, too. "They've got algorithmic improvements, and the method that uses tensor methods [now] uses the GPU," Gambetta told Ars. "So I think it's a combination of both."
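IBM has described several mitigation approaches, including zero-noise extrapolation. Purely as a generic illustration of the extrapolation idea, and not IBM's actual implementation, the sketch below fits expectation values measured at deliberately amplified noise levels and extrapolates back to the zero-noise limit; all the numbers are made up.

```python
import numpy as np

# Hypothetical expectation values of some observable, measured after running the
# same circuit with the noise deliberately amplified by known factors
# (scale 1.0 = the raw hardware run). All numbers here are made up.
noise_scales = np.array([1.0, 1.5, 2.0, 3.0])
measured = np.array([0.71, 0.63, 0.55, 0.42])

# Fit a low-order model and extrapolate to the zero-noise limit.
coeffs = np.polyfit(noise_scales, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"mitigated estimate at zero noise: {zero_noise_estimate:.3f}")
```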

Read more of this story at Slashdot.

Google Identifies Low Noise 'Phase Transition' In Its Quantum Processor

An anonymous reader quotes a report from Ars Technica: Back in 2019, Google made waves by claiming it had achieved what has been called "quantum supremacy" -- the ability of a quantum computer to perform operations that would take a wildly impractical amount of time to simulate on standard computing hardware. That claim proved to be controversial, in that the operations were little more than a benchmark that involved getting the quantum computer to behave like a quantum computer; separately, improved ideas about how to perform the simulation on a supercomputer cut the time required down significantly. But Google is back with a new exploration of the benchmark, described in a paper published in Nature on Wednesday. It uses the benchmark to identify what Google calls a phase transition in the performance of its quantum processor, and to pinpoint conditions where the processor can operate with low noise. Taking advantage of that, the researchers again show that, even giving classical hardware every potential advantage, it would take a supercomputer a dozen years to simulate the equivalent computation.
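The benchmark in question is random circuit sampling, scored with cross-entropy benchmarking (XEB): the processor samples bitstrings from random circuits, and each sample is scored against the ideal output probabilities computed classically. The following is a toy illustration of the scoring formula only, not Google's code, with a made-up ideal distribution standing in for a simulated circuit.

```python
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 10
dim = 2 ** n_qubits

# Stand-in for the ideal output distribution of one random circuit; for deep random
# circuits the probabilities follow the Porter-Thomas (exponential) distribution,
# which a flat Dirichlet draw mimics here.
ideal_probs = rng.dirichlet(np.ones(dim))

# Stand-in for samples from a noisy processor: a mixture of the ideal distribution
# and uniform noise, with the mixing weight playing the role of circuit fidelity.
true_fidelity = 0.6
noisy_probs = true_fidelity * ideal_probs + (1 - true_fidelity) / dim
samples = rng.choice(dim, size=200_000, p=noisy_probs)

# Linear XEB score: ~1 for a perfect processor, ~0 for uniform noise, and roughly
# equal to the circuit fidelity in between.
f_xeb = dim * ideal_probs[samples].mean() - 1
print(f"estimated XEB fidelity: {f_xeb:.2f}")  # close to 0.6 here
```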

Read more of this story at Slashdot.

IBM Opens Its Quantum-Computing Stack To Third Parties

An anonymous reader quotes a report from Ars Technica, written by John Timmer: [P]art of the software stack that companies are developing to control their quantum hardware includes software that converts abstract representations of quantum algorithms into the series of commands needed to execute them. IBM's version of this software is called Qiskit (although it was made open source and has since been adopted by other companies). Recently, IBM made a couple of announcements regarding Qiskit, both benchmarking it in comparison to other software stacks and opening it up to third-party modules. [...] Right now, the company is supporting six third-party Qiskit functions that break down into two categories. The first can be used as stand-alone applications and are focused on providing solutions to problems for users who have no expertise programming quantum computers. One calculates the ground-state energy of molecules, and the second performs optimizations. But the remainder are focused on letting users get more out of existing quantum hardware, which tends to be error prone. But some errors occur more often than others. These errors can be due to specific quirks of individual hardware qubits or simply because some specific operations are more error prone than others. These can be handled in two different ways. One is to design the circuit being executed to avoid the situations that are most likely to produce an error. The second is to examine the final state of the algorithm to assess whether errors likely occurred and adjust to compensate for any. And third parties are providing software that can handle both of these. One of those third parties is Q-CTRL, and we talked to its CEO, Michael Biercuk. "We build software that is really focused on everything from the lowest level of hardware manipulation, something that we call quantum firmware, up through compilation and strategies that help users map their problem onto what has to be executed on hardware," he told Ars. (Q-CTRL is also providing the optimization tool that's part of this Qiskit update.) "We're focused on suppressing errors everywhere that they can occur inside the processor," he continued. "That means the individual gate or logic operations, but it also means the execution of the circuit. There are some errors that only occur in the whole execution of a circuit as opposed to manipulating an individual quantum device." Biercuk said Q-CTRL's techniques are hardware agnostic and have been demonstrated on machines that use very different types of qubits, like trapped ions. While the sources of error on the different hardware may be distinct, the manifestations of those problems are often quite similar, making it easier for Q-CTRL's approach to work around the problems. Those work-arounds include things like altering the properties of the microwave pulses that perform operations on IBM's hardware, and replacing the portion of Qiskit that converts an algorithm to a series of gate operations. The software will also perform operations that suppress errors that can occur when qubits are left idle during the circuit execution. As a result of all these differences, he claimed that using Q-CTRL's software allows the execution of more complex algorithms than are possible via Qiskit's default compilation and execution. "We've shown, for instance, optimization with all 156 qubits on [an IBM] system, and importantly -- I want to emphasize this word -- successful optimization," Biercuk told Ars. "What it means is you run it and you get the right answer, as opposed to I ran it and I kind of got close."

Read more of this story at Slashdot.
