Microsoft’s Quantum Computing Breakthrough, Explained
Microsoft recently announced a potential breakthrough in quantum computing, claiming to have made significant strides in the development of quantum processors. The company’s research team says they’ve built “the world’s first quantum processor powered by topological qubits” and are aiming to develop a scalable quantum computer prototype “in years, not decades.”
The announcement, though not yet validated, could mark a pivotal point in the field of computing. Microsoft also announced the creation of “a new state of matter” in the process: a topoconductor. In addition, the U.S. government’s Defense Advanced Research Projects Agency (DARPA) has selected Microsoft, along with quantum computing company PsiQuantum, to explore unconventional approaches to quantum computing, a field blending computer science, theoretical physics, and mathematics.
The Microsoft chip, called Majorana 1, is named after Ettore Majorana, the Italian physicist whose theories provided the basis for the chip’s design. It is the latest effort in a broader race that includes IBM, Google, Nokia Bell Labs, and others working to develop the fundamental mechanisms needed for quantum computers. Quantum computing, a field built on the principles of quantum mechanics, demands precision engineering at the subatomic level, making it a remarkably challenging endeavor.
The potential advantages of developing computers that surpass the capabilities of today’s classical machines are significant. Since the computing revolution of the late 1940s, the basic operating principle of computers has remained largely unchanged: chips store, manipulate, and execute instructions using bits, the 0/1 binary logic that underlies all classical computers.
Scott Aaronson, a professor of theoretical computer science at the University of Texas at Austin, noted that today’s machines are “millions of times faster with millions of times more memory.”
This progress has followed semiconductor pioneer Gordon Moore’s projection, known as Moore’s Law, which states that chip density—and, by extension, computing speed—should double about every two years. Moore believed his law would be valid at least through 1985, but it has largely continued to the present day. Data from the Center for Strategic and International Studies shows that “the computing power of a single integrated circuit today is roughly two billion times what it was in 1960.”
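As a rough consistency check (my own back-of-the-envelope arithmetic, not a figure from CSIS), a factor of two billion is about 31 doublings, which at one doubling every two years spans roughly 62 years, in line with the stretch from 1960 to the early 2020s:

```python
import math

# Back-of-the-envelope check (illustrative only): is "two billion times
# since 1960" consistent with doubling roughly every two years?
growth_factor = 2e9                     # claimed increase in computing power
doublings = math.log2(growth_factor)    # about 30.9 doublings
years_needed = doublings * 2            # about 62 years at one doubling per two years

print(f"doublings: {doublings:.1f}")                    # doublings: 30.9
print(f"years at 2 per doubling: {years_needed:.0f}")   # 62, i.e. 1960 + 62 = 2022
```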
However, classical computation has its limits. The physicist Richard Feynman is believed to have been among the first to articulate one of them, in the early 1980s. His research in quantum thermodynamics, electron-photon interactions, and liquid helium led him to realize that the complex equations arising in these fields could not be solved efficiently by classical computers. The problem is one of scale: fully describing a quantum system requires tracking a number of variables that grows exponentially with the system’s size, more than any classical computer can manage efficiently, no matter how fast it is.
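To put a number on that scale (an illustrative estimate, not a figure from the article): a general state of n two-level quantum particles, i.e. n qubits, requires 2^n complex amplitudes, so the memory a classical simulator needs roughly doubles with every particle added.

```python
# Illustrative estimate (assuming a dense state-vector simulation with
# 16 bytes per complex amplitude) of why classical simulation hits a wall.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16   # 2**n complex amplitudes

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")

# 30 qubits ->         16 GiB   (a laptop can just about manage this)
# 40 qubits ->     16,384 GiB   (a large cluster)
# 50 qubits -> 16,777,216 GiB   (about 16 pebibytes, far beyond any single machine)
```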
Feynman, and later others, saw a way out of the problem: build a processor that itself operates according to the laws of quantum mechanics.
Unlike classical computers, which use bits with values of 0 or 1, quantum computers use quantum bits (qubits), typically realized with quantum particles such as electrons and photons. A qubit can exist in a superposition: rather than being restricted to 0 or 1, it can occupy a weighted combination of both at once. Moreover, qubits can be linked through entanglement, so that a change in one affects another. According to MIT Technology Review, a group of interconnected qubits “can provide way more processing power than the same number of binary bits.”
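A small numerical sketch can make this concrete. The code below (a toy model using numpy, with nothing specific to any company’s hardware) represents a qubit as a pair of complex amplitudes whose squared magnitudes give the odds of reading 0 or 1, then links two qubits into an entangled pair:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measurement yields 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
zero = np.array([1, 0], dtype=complex)      # behaves like a classical 0
one = np.array([0, 1], dtype=complex)       # behaves like a classical 1
superposition = (zero + one) / np.sqrt(2)   # an equal mix of 0 and 1

print(np.abs(superposition) ** 2)           # [0.5 0.5]

# Two qubits can be entangled: this joint state (a Bell state) cannot be
# separated into independent descriptions of each qubit, so measuring one
# immediately determines what the other will show.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)                    # [0.5 0.  0.  0.5]: only 00 or 11 ever appears
```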
Chetan Nayak, technical fellow and corporate vice president of quantum hardware at Microsoft, has commented that quantum computers are not simply faster versions of classical computers, but they are “an entirely different modality of computing.”
Furthermore, contrary to a common misconception, Aaronson clarifies that quantum computers do not merely try “every possible answer in parallel.” Instead, quantum computing “choreograph[s]” the possible states of each qubit so that “paths leading to [the] wrong answer … cancel each other out” while “paths leading to the right answer should reinforce each other.”
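A standard textbook illustration of that cancellation (not Microsoft’s algorithm, just the simplest possible case) is to apply a Hadamard gate to a qubit twice: after the first gate the qubit is an even superposition, but after the second the two computational paths leading to 1 arrive with opposite signs and cancel, while the paths leading to 0 reinforce, so the qubit returns to 0 every time.

```python
import numpy as np

# Hadamard gate: sends 0 into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
zero = np.array([1, 0], dtype=complex)

after_one = H @ zero       # amplitudes [0.71, 0.71]: a 50/50 coin if measured now
after_two = H @ after_one  # paths to 1 carry amplitudes +1/2 and -1/2 and cancel;
                           # both paths to 0 carry +1/2 and reinforce

print(np.abs(after_one) ** 2)   # [0.5 0.5]
print(np.abs(after_two) ** 2)   # [1. 0.]: the wrong outcome has vanished
```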
This could usher in an era of problem-solving that has been envisioned by the first pioneers of the field, including Feynman. In Aaronson’s view, quantum computing’s biggest benefit is “simply the simulation of quantum mechanics itself … to learn about chemical reactions … design new chemical processes, new materials, new drugs, new solar cells, new superconductors.” DARPA anticipates significant advances if quantum computing becomes a reality, including “faster automation, improved target recognition, and more precise, lethal weapons,” as well as more robust cybersecurity, according to Defense News.
However, qubits’ quantum states are delicate: disturbances ranging from changes in light or temperature to vibrations can alter their superpositions and introduce errors. This “decoherence” of qubits has been a major hurdle in quantum computing research and development. As a result, today’s quantum processors are noisy and error-prone.
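As a toy illustration of that fragility (a simplified model of my own, not a description of Majorana 1’s error behavior), suppose a stray phase flip strikes the qubit between the two Hadamard gates in the earlier sketch: the carefully arranged cancellation now works against the computation, and because real noise strikes at random, the output degrades into an unreliable mix of right and wrong answers.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # a phase-flip error from stray noise
zero = np.array([1, 0], dtype=complex)

clean = H @ H @ zero      # undisturbed circuit: always reads 0
noisy = H @ Z @ H @ zero  # one phase flip between the gates ruins the intended interference

print(np.abs(clean) ** 2)   # [1. 0.]
print(np.abs(noisy) ** 2)   # [0. 1.]: the qubit now reads the wrong value
```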
Microsoft’s approach could give it a unique advantage over its competitors. Topological qubits are designed to be inherently stable, with protection against errors built into the hardware itself. But topological qubits are difficult to build, and their states are difficult to measure. Nayak has called Microsoft’s qubit project “the longest-running R&D program in Microsoft history.” As of last September, IBM and Google had built 127- and 72-qubit processors, respectively, whereas Microsoft had built an 8-qubit processor. If Nayak’s team succeeds in producing a more stable processor at scale, it could be an advance as significant for quantum computing as the transistor was for classical computing.
The main question now is whether Microsoft has actually built such a processor. Nature published the company’s peer-reviewed paper on the same day as the announcement, but the paper did not offer proof that Microsoft had built a topological qubit. Microsoft issued a press release and met with several hundred other researchers to discuss the Nature paper and present the team’s progress since the paper’s submission the previous year.
The press release was triumphant: “Majorana 1: the world’s first Quantum Processing Unit (QPU) powered by a Topological Core, designed to scale to a million qubits on a single chip… With the core building blocks now demonstrated … we’re ready to move from physics breakthrough to practical implementation.” It certainly seemed like Microsoft had built a topoconductor.
Aaronson wrote on his blog, Shtetl-Optimized, “Microsoft is unambiguously claiming to have created a topological qubit, and they just published a relevant paper in Nature, but their claim to have created a topological qubit has not yet been accepted by peer review.”
Henry Legg, a physicist at the University of St. Andrews, was more direct: “The optimism is definitely there, but the science isn’t there.” The announcement, in other words, may have been premature on Microsoft’s part. Even so, Aaronson hasn’t been this enthusiastic about quantum computing in more than 25 years: “This past year or two is the first time I’ve felt like the race to build a scalable fault-tolerant quantum computer is actually underway.”