Quantum Computing. Melanie Swan
3.1 Introduction
Quantum computers are in the early stages of development and would likely be complementary to existing computational infrastructure, interacting with classical devices and being accessed either locally or as a cloud service. Currently, the top methods demonstrate 30–70 qubits of processing power and achieve fidelity rates above 99% (i.e. error rates below a fault-tolerance threshold of 1%). However, there is uncertainty about the realizability of scalable universal quantum computers. Quantum computers may excel at solving certain types of problems, such as optimization. This could offer a step-up in computing such that it is possible to solve new classes of problems, but not all problems. For example, for the well-known problem of unstructured search, a quantum computer can explore a fixed-size space of N possibilities in on the order of √N steps, the square root of the time required by a classical computer (a quadratic speedup).
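To make this quadratic speedup concrete, here is a minimal sketch of Grover-style amplitude amplification in Python/NumPy (an illustration added for this discussion, not from the original text; the space size N = 16 and the marked index are arbitrary choices). A classical search over N unsorted items needs about N/2 queries on average, whereas roughly (π/4)√N Grover iterations suffice:

```python
import numpy as np

N = 16            # size of the search space (4 qubits); illustrative choice
marked = 11       # index of the "winning" item; illustrative choice

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the marked item's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: inversion about the mean amplitude.
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# Only ~ (pi/4) * sqrt(N) iterations are needed, versus ~ N/2
# classical queries on average.
iterations = round(np.pi / 4 * np.sqrt(N))      # 3 for N = 16
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

print(f"P(marked item) after {iterations} iterations: {state[marked]**2:.3f}")
# prints ~0.961: the marked item is found with high probability
```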
Quantum computing is an early-stage technology with numerous risks and limitations (Dyakonov, 2018). The long-term goal of universal quantum computing is not immediate as many challenges including error correction need to be resolved. In the short term, the focus is on solving simple problems in which quantum computers offer an advantage over classical methods through NISQ devices (noisy intermediate-scale quantum devices) (Preskill, 2018).
3.1.1 Breaking RSA encryption
One of the biggest questions is when it might be possible to break existing cryptography standards with quantum computing. The current standard is 2048-bit RSA (Rivest–Shamir–Adleman) encryption, which is widely used for activities such as securely sending credit card details over the internet. Predictions vary as to when it may be possible to break the current standard, that is, to factor the 2048-bit integers used by the RSA method. Although this is not imminent, methods are constantly improving, and readying cryptographic systems for the quantum era is recommended.
A 2019 report published by the US National Academies of Sciences predicts that breaking RSA encryption is unlikely within the next decade. The report indicates that any serious applications of quantum computing are at least 10 years away (Grumbling & Horowitz, 2019). Given the current state of quantum computing and the recent rates of progress, “it is highly unexpected that a quantum computer that can compromise RSA 2048 or comparable discrete logarithm-based public key cryptosystems will be built within the next decade” (Grumbling & Horowitz, 2019, 157). The report’s stance on error correction is that “The average error rate of qubits in today’s larger devices would need to be reduced by a factor of 10 to 100 before a computation could be robust enough to support [the required] error correction at scale” (Grumbling & Horowitz, 2019). The report further highlights that “at this error rate, the number of physical qubits held by these devices would need to increase at least by a factor of 10⁵ in order to create a useful number of effective logical qubits” (Grumbling & Horowitz, 2019). A 2016 National Institute of Standards and Technology (NIST) report reaches a similar conclusion, noting that some of the most aggressive estimates predict that a quantum computer powerful enough to break 2048-bit RSA could be built by 2030, at a potential cost of a billion dollars (Chen et al., 2016, 6).
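Read together, these quotes imply a large gap between today’s devices and what breaking RSA would require. A rough back-of-envelope reading (a sketch only; the starting figures of ~50 qubits and ~1% error rates are illustrative values drawn from the ranges cited in this chapter, not from the report’s own arithmetic):

```python
# Illustrative arithmetic for the scaling gap described above; the
# starting figures (~50 qubits, ~1% error) are assumptions drawn from
# the ranges cited in this chapter, not from the report itself.
physical_qubits_today = 50
error_rate_today = 1e-2                          # ~1% per-operation error

target_error_rate = error_rate_today / 100       # "reduced by a factor of 10 to 100"
target_physical_qubits = physical_qubits_today * 10**5   # "a factor of 10^5"

print(f"target error rate:  {target_error_rate:.0e}")      # 1e-04
print(f"target qubit count: {target_physical_qubits:,}")   # 5,000,000
```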
One complication in making predictions is that the number of required qubits (processing power) needed to factor 2048-bit RSA integers varies by method. Different algorithms need different numbers of qubits (Mavroeidis et al., 2018). Although difficult to estimate, “current estimates range from tens of millions to a billion physical qubits” (Mosca, 2018, 39). Newer estimates propose more granularity, indicating in more detail how a quantum computer might perform the calculation with 20 million noisy physical qubits (with quantum error correction building reliable logical qubits on top of them) in just 8 hours (Gidney & Ekera, 2019). (The method centers on modular exponentiation, which is the most computationally expensive operation in Shor’s algorithm for factoring.) In 2012, a 4-qubit quantum computer factored the number 143, and in 2014, a similar device factored the number 56,153. However, scaling up quickly is not straightforward because the extent of error correction required is uncertain. The Gidney & Ekera (2019) result, reducing the estimated requirement to roughly 20 million noisy physical qubits, is potentially a landmark step.
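To show where modular exponentiation enters, the following is a small classical sketch in Python of the number-theoretic reduction at the heart of Shor’s algorithm (illustrative only, not from the original text): factoring n reduces to finding the period r of aˣ mod n. Here the period is found by brute force, which is exponentially slow for large n; the quantum computer’s entire contribution is to find this period efficiently, using modular exponentiation and the quantum Fourier transform.

```python
import math
import random

def order(a, n):
    """Brute-force the multiplicative order r of a mod n (a**r = 1 mod n).
    This period finding is exactly the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n):
    """Factor n via the classical reduction used in Shor's algorithm."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                 # lucky guess: a shares a factor with n
        r = order(a, n)
        if r % 2 == 0:
            y = pow(a, r // 2, n)    # modular exponentiation
            f = math.gcd(y - 1, n)
            if 1 < f < n:
                return f             # a nontrivial factor of n
        # otherwise retry with a new random a

print(shor_reduction(15))            # prints 3 or 5
```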
One of the nearest-term remedies for post-quantum security is quantum cryptography, in the form of quantum key distribution (QKD), which has been foreseen in quantum information science roadmaps for some time (Los Alamos National Laboratory (LANL), 2004). QKD is the idea of generating cryptographic keys whose secrecy is guaranteed by quantum mechanical properties (measuring a quantum state disturbs it, so eavesdropping is detectable) and distributing them via global communications networks, both satellite-based and terrestrial. QKD has been experimentally demonstrated but is not yet widely commercialized. The market for QKD is estimated to reach $980 million by 2024, up from $85 million in 2019 (Gasman, 2019).
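The text above does not single out a protocol, but BB84 (Bennett & Brassard, 1984) is the canonical QKD scheme. The following is a minimal classical simulation in Python of its key-sifting step, under simplifying assumptions (no eavesdropper, noiseless channel); it is an illustration, not an implementation of a real QKD stack. Alice encodes random bits in randomly chosen bases; Bob measures in his own random bases; the positions where the bases happen to match become the shared key.

```python
import random

def bb84_sift(n_bits=32, seed=1):
    """Toy BB84 key sifting with no eavesdropper and a noiseless channel."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]  # + rectilinear, x diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]

    # When Bob's basis matches Alice's, he recovers her bit exactly;
    # when it differs, his outcome is random and the position is discarded.
    key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
    return key

key = bb84_sift()
print(len(key), key)   # roughly half of the transmitted bits survive sifting
```

An eavesdropper forced to measure in randomly guessed bases would disturb the transmitted states and reveal herself through an elevated error rate when Alice and Bob compare a sample of their sifted key.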
3.2 Basic Concepts: Bit and Qubit
Quantum computing springs from Nobel physicist Richard Feynman’s intuition that it should be possible to perform very powerful computations by using quantum building blocks. He suggested the idea of simulating physics with computers by means of a universal quantum simulator.
Feynman worked on many problems at the small scales of the quantum mechanical domain. He famously announced that “There’s plenty of room at the bottom” (Feynman, 1960), referring to the idea of miniaturization at the atomic scale such that the entire 24-volume set of the Encyclopedia Britannica might be printed on the head of a pin. These ideas helped to sponsor the nanotechnology industry in the 1990s and are likewise motivating the current development of quantum computing. To get an idea of scale, the nanometer scale is 10⁻⁹ m, the atomic scale (in terms of the Bohr radius being the probable distance from the nucleus to the electron) is 10⁻¹¹ m, and the electron scale (the size of the electron) is 10⁻¹⁵ m.
Feynman suggested that a quantum computer could be an efficient universal simulator of quantum mechanics. Such a “universal quantum simulator” (Feynman, 1982, 474) would be a different kind of computer that is not a traditional Turing machine. He posited two ways to simulate quantum mechanics with computers. One is to reconceive the notion of computers and build computers out of quantum mechanical elements that obey quantum mechanical laws. The other is to imitate quantum mechanical systems with classical systems.
Feynman’s key thought is that the more closely computing systems can be built to match the structure of nature, the better they can simulate nature. He says that “the various field theories have the same kind of behavior, and can be simulated in every way, apparently, with little latticeworks of spins and other things” (Feynman, 1982, 474–5). Assuming that the world is made in a discrete lattice, “the phenomena of field theories can be well imitated by many phenomena in solid state theory (which is simply the analysis of a latticework of crystal atoms, and in the case of the kind of solid state I mean each atom is just a point which has numbers associated with it, with quantum mechanical rules)” (Feynman, 1982, 475).
Feynman proposed the idea of the universal quantum simulator in 1982, following which there have been other theoretical developments. In 1985, David Deutsch proposed a universal quantum computer based on the idea that quantum gates could function in a similar fashion to traditional digital computing binary logic gates (Deutsch, 1985). In 2000, Charles Bennett showed that it is possible to efficiently simulate any classical computation using a quantum computer (Bennett & DiVincenzo, 2000).
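As a concrete picture of quantum gates working like binary logic gates, here is a minimal NumPy sketch (added for illustration, not from the original text) that represents gates as unitary matrices: a Hadamard gate puts one qubit into superposition, and a controlled-NOT (the quantum counterpart of XOR) then entangles it with a second qubit, producing a Bell state.

```python
import numpy as np

# Gates as unitary matrices acting on state vectors.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])        # controlled-NOT: quantum XOR

state = np.array([1.0, 0.0, 0.0, 0.0])            # two qubits in |00>

# H on the first qubit, then CNOT, yields (|00> + |11>) / sqrt(2).
state = CNOT @ (np.kron(H, np.eye(2)) @ state)

print(state)   # ~ [0.707, 0, 0, 0.707]: the entangled Bell state
```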