
Amazon Web Services is the third of the Big Three cloud computing giants to announce a quantum computing chip breakthrough this week, joining Microsoft last week and Google in December.
Amazon’s researchers claim their chip, Ocelot, uses qubits, the fundamental building blocks of quantum computing, far more efficiently, in an approach the company likens to the role of transistors in traditional chips.
In a technical research article in this week’s Nature magazine, lead author Harald Putterman of Amazon’s AWS Center for Quantum Computing in Pasadena, Calif., and colleagues explain how the use of analog circuits, rather than digital ones and zeros, can drastically reduce the number of physical qubits needed for a working quantum chip.
Also: Microsoft’s quantum chip Majorana 1 is a few qubits short
A blog post from Amazon explains more details about the effort in simpler language than the Nature paper.
The chip follows on the announcement last week by Microsoft of its Majorana 1 chip, and Google’s announcement in December of its Willow chip.
All three teams confront the central problem for quantum chip efforts: how to group together enough physical qubits so that their individual errors average out to produce a reliable, verifiable “logical qubit” that can be used for computing operations such as adding numbers.
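The error-averaging idea can be sketched with a deliberately simplified classical analogy: store one logical bit redundantly in several physical bits and take a majority vote. This toy Python example ignores quantum effects entirely (superposition, phase errors), so it is not how Ocelot works; it only shows why redundancy suppresses errors.

```python
import random

def encode(bit, n=9):
    # Store one logical bit redundantly in n physical bits.
    return [bit] * n

def apply_noise(bits, p=0.1):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote: occasional individual flips average out.
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
trials = 10_000
raw_errors = sum(apply_noise([0], p=0.1)[0] for _ in range(trials))
logical_errors = sum(decode(apply_noise(encode(0), p=0.1)) for _ in range(trials))
print(raw_errors / trials)      # error rate of a single unprotected bit, about 0.1
print(logical_errors / trials)  # far lower after encoding and majority voting
```

With nine physical bits, the logical bit fails only when five or more flip at once, which is why the measured logical error rate comes out orders of magnitude below the raw 10% flip rate.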
The difference among the three lies in which kind of physical qubit each company bets is the most efficient route to a logical qubit. Google’s Willow uses fairly standard superconducting qubits, while Microsoft’s Majorana 1 uses exotic Majorana particles, which are their own antiparticles.
Putterman and team purport to show that their approach, which uses what are called cat qubits, is better than the others.
“Ocelot represents our first chip with the cat qubit architecture and an initial test of its suitability as a fundamental building block for implementing quantum error correction,” relates the AWS blog post.
Also: Google’s quantum breakthrough is ‘truly remarkable’ – but there’s more to do
A cat qubit, named for Schrödinger’s cat, the famous thought-experiment feline that is both alive and dead inside a box until observed, is an analog qubit rather than the digital kind that makes up most quantum chips. An analog qubit is not counted, as with digital ones and zeros, but measured as a continuous value, like a wave. In this case, the value is the amplitude of a collection of photons trapped in a light-shaping waveguide.
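What a continuous amplitude means can be illustrated numerically. A cat state is built from coherent states of light with opposite amplitudes; the rough NumPy sketch below (assuming basic familiarity with the photon-number, or Fock, basis) is illustrative only, and the numbers are not taken from the Ocelot paper.

```python
import numpy as np
from math import factorial

def coherent_state(alpha, n_max=30):
    # Photon-number (Fock) amplitudes of a coherent state |alpha>:
    # c_n = exp(-|alpha|^2 / 2) * alpha^n / sqrt(n!)
    n = np.arange(n_max)
    facts = np.array([factorial(k) for k in n], dtype=float)
    return np.exp(-abs(alpha)**2 / 2) * alpha**n / np.sqrt(facts)

alpha = 2.0
plus, minus = coherent_state(alpha), coherent_state(-alpha)

# The two opposite-amplitude states are nearly orthogonal for modest alpha:
overlap = np.dot(plus, minus)          # analytically exp(-2 * alpha**2)
print(round(overlap, 6), round(np.exp(-2 * alpha**2), 6))

# A "cat" state is their normalized superposition:
cat = (plus + minus) / np.sqrt(2 * (1 + overlap))
print(round(float(np.dot(cat, cat)), 6))  # 1.0: a properly normalized state
```

The two nearly orthogonal amplitude states play the role that 0 and 1 play in a digital qubit, which is what lets a continuous light field encode a bit of quantum information.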
An ocelot is a wild cat native to the Americas, ranging from the southwestern US to South America, so the chip’s name is a nice play on words about house cats and cat qubits.
Scientists such as John Preskill of Caltech explored analog approaches to quantum computing over 20 years ago as a more efficient means of making a qubit. The Ocelot team bases its approach on experiments described in 2019 by a team of researchers at Inria, France’s National Institute for Research in Digital Science and Technology.
Also: If you’re not working on quantum-safe encryption now, it’s already too late
Putterman and team contend that the cat qubits’ measurement is able to achieve the quantum error correction “with less than a fifth as many qubits — five data qubits and four ancilla qubits, versus 49 qubits for a surface code device [that is, a traditional digital qubit device].”
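The arithmetic behind the “less than a fifth” claim is straightforward, and worth spelling out:

```python
cat_qubits = 5 + 4        # data qubits plus ancilla qubits in the Ocelot scheme
surface_code_qubits = 49  # the comparable surface-code figure the paper cites
ratio = cat_qubits / surface_code_qubits
print(cat_qubits, round(ratio, 3))  # 9 total, about 0.184 of 49, under one fifth
```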
A more-efficient qubit is analogous to how silicon transistors revolutionized the computer chip, AWS asserts. “The history of computing shows that scaling the right component can have massive consequences for cost, performance, and even feasibility,” the blog post states. “The computer revolution truly took off when the transistor replaced the vacuum tube as the fundamental building block to scale.”
The choice of analog over digital is part of a parallel field in electronics focused on extracting the benefits of analog computing. Because they measure rather than count, analog chips have advantages over digital ones, such as manipulating variables that would be laborious to count and handling variables with fuzzy values.
Also: For Turing Award winner, everything is computation and some problems are unsolvable
In its present version, the Ocelot chip contains only five of the so-called cat qubits, not enough to perform actual logical operations but enough to store a quantum bit of information.
Despite the challenges of analog, AWS’s researchers, like Google’s and Microsoft’s, believe they have the right qubit to put enough qubits on a chip to someday assemble a working computer.
“We believe that Ocelot’s architecture, with its hardware-efficient approach to error correction, positions us well to tackle the next phase of quantum computing: learning how to scale,” the blog post states. “Using a hardware-efficient approach will allow us to more quickly and cost-effectively achieve an error-corrected quantum computer that benefits society.”