Computers that exploit the strange rules of quantum mechanics could soon solve problems beyond the reach of existing technology. Today’s machines are still far from that goal, but the field of quantum computing has made tremendous progress since its inception.
Quantum computing has gone from an academic curiosity to a multibillion-dollar industry in less than half a century and shows no signs of stopping. Here are the 12 most important stages of that journey.
1980: The quantum computer is born
By the 1970s, scientists had begun to think about potential crossovers between quantum mechanics and information theory. But it was the American physicist Paul Benioff who crystallized many of these ideas when he published the first description of a quantum computer. He proposed a quantum version of a “Turing machine” – a theoretical model of computation, devised by the famous British computer scientist Alan Turing, capable of implementing any algorithm. By demonstrating that such a device could be described using the equations of quantum mechanics, Benioff laid the foundation for the new field of quantum computing.
1981: Richard Feynman popularizes quantum computing
Both Benioff and the legendary physicist Richard Feynman gave talks on quantum computing at the first Conference on the Physics of Computation in 1981. Feynman’s keynote focused on using computers to simulate physics. He pointed out that because the physical world is quantum in nature, simulating it exactly requires computers that themselves operate according to the rules of quantum mechanics. He introduced the concept of a “quantum simulator”, which cannot run arbitrary programs the way a Turing machine can, but can be used to simulate quantum mechanical phenomena. The talk is often credited with igniting interest in quantum computing as a discipline.
1985: The “universal quantum computer”
One of the fundamental concepts of computer science is the idea of the universal Turing machine. Introduced by its namesake in 1936, it is a particular type of Turing machine capable of simulating the behavior of any other Turing machine, allowing it to solve any computable problem. However, David Deutsch, an Oxford physicist working on the quantum theory of computation, pointed out in a 1985 paper that, because the universal computer described by Turing was based on classical physics, it would not be able to simulate a quantum computer. He reformulated Turing’s work using quantum mechanics to devise a “universal quantum computer” capable of simulating any physical process.
1994: First killer use case for quantum computers
Despite the theoretical promise of quantum computers, researchers had yet to find clear practical applications for the technology. The American mathematician Peter Shor became the first to do so when he introduced a quantum algorithm that could efficiently factor large numbers. Factoring is the process of finding the prime numbers that multiply together to produce a larger number. It becomes rapidly harder as numbers grow, and that difficulty underpins many of today’s main encryption schemes. Shor’s algorithm can factor numbers exponentially faster than the best known classical methods, raising fears that quantum computers could be used to break modern cryptography and spurring the development of post-quantum cryptography.
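To get a feel for why the classical version of this problem is hard, here is a minimal, illustrative Python sketch (not Shor’s algorithm) that factors a number by trial division. The work grows roughly with the square root of the number, i.e. exponentially in its digit count, which is why the multi-hundred-digit moduli used in encryption schemes such as RSA are far out of reach for approaches like this.

```python
def trial_division(n: int) -> list[int]:
    """Return the prime factors of n by brute-force trial division.

    Illustrative only: the number of trial divisors grows roughly with
    sqrt(n), which is hopeless for the ~2048-bit numbers used in modern
    encryption. Shor's algorithm sidesteps this on quantum hardware.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is itself prime
    return factors


# The number 15 (= 3 x 5) is the same one later factored on real quantum hardware.
print(trial_division(15))  # [3, 5]
```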
1996: Quantum computing takes on search
It didn’t take long for another promising application to appear. Bell Labs computer scientist Lov Grover proposed a quantum algorithm for unstructured search, which means searching for information in a database with no obvious organizing system. It’s like looking for the proverbial needle in a haystack, and it is a common problem in computer science, but even the best classical search algorithms can be slow when faced with large amounts of data. Grover’s algorithm, as it became known, exploits the quantum phenomenon of superposition to speed up the search quadratically: a database of N entries can be searched with roughly √N queries instead of N.
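A back-of-the-envelope Python sketch of the query counts involved (not an implementation of the algorithm itself) shows how quickly the quadratic speedup adds up as the database grows:

```python
import math

def classical_expected_queries(n: int) -> float:
    """A classical unstructured search must check items one by one;
    on average it finds the single marked item after about N/2 checks."""
    return n / 2

def grover_oracle_calls(n: int) -> float:
    """Grover's algorithm finds the marked item with high probability
    after roughly (pi/4) * sqrt(N) oracle calls -- a quadratic speedup."""
    return (math.pi / 4) * math.sqrt(n)

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"N = {n:>13,}: classical ~{classical_expected_queries(n):>12,.0f} queries, "
          f"Grover ~{grover_oracle_calls(n):>8,.0f} queries")
```

For a billion entries, the classical search needs on the order of 500 million checks, while Grover’s algorithm needs only about 25,000 oracle calls.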
1998: First demonstration of a quantum algorithm
Devising quantum algorithms on a blackboard is one thing, but actually implementing them on hardware has proven much more difficult. In 1998, a team led by IBM researcher Isaac Chuang made a breakthrough when it showed that Grover’s algorithm could run on a computer equipped with two qubits, the quantum equivalent of bits. Just three years later, Chuang also led the first implementation of Shor’s algorithm on quantum hardware, factoring the number 15 using a seven-qubit processor.
1999: The superconducting quantum computer is born
The fundamental elements of a quantum computer, known as qubits, can be implemented in a wide range of physical systems. But in 1999, physicists at the Japanese technology company NEC hit on what would become the most popular approach to quantum computing today. In a paper in Nature, they demonstrated that superconducting circuits could be used to create qubits and that these could be controlled electronically. Superconducting qubits are now used by many major quantum computing companies, including Google and IBM.
2011: Release of the first commercial quantum computer
Despite considerable progress, quantum computing was still primarily an academic discipline. The launch of the first commercially available quantum computer by the Canadian company D-Wave in May 2011 marked the beginning of the quantum computing industry. The start-up’s D-Wave One was equipped with 128 superconducting qubits and cost about $10 million. However, the device was not a universal quantum computer. It used an approach known as quantum annealing to solve a specific type of optimization problem, and there was little evidence that it provided a speedup over classical approaches.
2016: IBM makes quantum computing available on the cloud
While several large tech companies were developing universal quantum computers internally, most academics and aspiring quantum developers had no way to experiment with the technology. In May 2016, IBM made its five-qubit processor available on the cloud for the first time, allowing people outside the company to run quantum computing work on its hardware. Within two weeks, more than 17,000 people signed up for the company’s IBM Quantum Experience service, giving many their first hands-on experience with a quantum computer.
2019: Google claims ‘quantum supremacy’
Despite theoretical promises of massive speedups, no one had yet demonstrated that a quantum processor could solve a problem faster than a classical computer. That changed in September 2019, when news emerged that Google had used a 53-qubit processor to perform a calculation in 200 seconds that, the company said, would have taken a supercomputer about 10,000 years to complete. The problem at hand had no practical use: Google’s processor simply performed random operations, and the researchers then calculated how long it would take to simulate them on a classical computer. But the achievement was hailed as the first demonstration of “quantum supremacy,” now more commonly referred to as “quantum advantage.”
2022: A classic algorithm disproves claims of supremacy
Google’s claim of quantum supremacy was met with skepticism in some quarters, particularly from arch-rival IBM, which argued that the speedup was overstated. A team from the Chinese Academy of Sciences and other institutions eventually proved the point, designing a classical algorithm that could simulate Google’s quantum operations in just 15 hours on 512 GPUs. They claimed that with access to one of the world’s largest supercomputers, the task could be done in seconds. The result was a reminder that classical computing still has plenty of room for improvement, and that quantum advantage is likely to remain a moving target.
2023: QuEra breaks record for most logical qubits
One of the biggest obstacles for quantum computers today is that the underlying hardware is highly error-prone. Because of the peculiarities of quantum mechanics, correcting these errors is complicated, and it has long been known that many physical qubits will be needed to create so-called “logical qubits” that are protected against errors and can perform operations reliably. In December 2023, Harvard researchers working with the start-up QuEra broke records by generating 48 logical qubits at once, 10 times more than anyone had achieved before. The team managed to run algorithms on these logical qubits, marking an important milestone on the road to fault-tolerant quantum computing.
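Quantum error correction is considerably more subtle than this (qubits cannot simply be copied, and phase errors must also be handled), but a classical repetition code gives a rough intuition for the physical-to-logical trade-off: encode one logical bit redundantly across several noisy physical bits and recover it by majority vote. The following Python sketch is offered purely as that classical analogy, not as a description of QuEra’s scheme.

```python
import random

def encode(bit: int, copies: int = 7) -> list[int]:
    """Encode one 'logical' bit as several redundant 'physical' bits."""
    return [bit] * copies

def noisy(bits: list[int], flip_prob: float) -> list[int]:
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote recovers the logical bit if fewer than half the copies flipped."""
    return int(sum(bits) > len(bits) / 2)

# Rough comparison of physical vs. logical error rates (classical analogy only).
trials, p = 100_000, 0.05
logical_errors = sum(decode(noisy(encode(0), p)) for _ in range(trials))
print(f"physical error rate: {p}, logical error rate: {logical_errors / trials:.5f}")
```

With a 5% physical error rate and seven copies, the simulated logical error rate drops to roughly a few in ten thousand, illustrating why error-corrected machines need many more physical qubits than logical ones.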