
The Prospect of Practical Quantum Information Processing

Revisiting the prospects for practical quantum information processing

Author | Yin Zhangqi (Professor, Quantum Technology Research Center, School of Physics, Beijing Institute of Technology)

In 2000, when Michael A. Nielsen (then a senior researcher at the Perimeter Institute for Theoretical Physics in Canada) and Isaac L. Chuang (professor of physics and electrical engineering at the Massachusetts Institute of Technology in the United States) wrote Quantum Computation and Quantum Information, now a classic textbook, they devoted a short section of the first chapter to "The Prospect of Practical Quantum Information Processing" (reproduced below). In 2003, shortly after I started graduate school, I read this book. I studied the first three chapters carefully and made the physical implementation of quantum information processing my main research direction. Nearly 20 years later, I took part in translating the book, re-read these passages, and had to marvel that the authors' vision was indeed first-rate: although a few of the ideas are slightly dated, the practical progress of quantum information processing over the past two decades has broadly followed their predictions.


The book is one of the most highly cited works in quantum information and physics. Many universities around the world use it as the textbook for quantum computing courses, and it is suitable for anyone interested in quantum computing and quantum information.

In 2000, the biggest theoretical bottleneck in quantum information processing, the noise problem, had just been overcome: quantum error-correcting codes and the threshold theorem for fault-tolerant quantum computing had been established. Nielsen & Chuang therefore emphasized the importance of this theory at the outset: if the noise level of a quantum computation can be brought below a certain "threshold", quantum error correction can further reduce the error rate and yield reliable quantum computers. The central goal in developing quantum information processing technology is thus to push the error rate down below the threshold and so achieve fault-tolerant quantum computing and quantum information processing.

Over the past 20 years, fault-tolerant quantum computing has advanced significantly in both theory and experiment. In 2000, the fault-tolerance threshold for quantum computing based on stabilizer codes was generally believed to be on the order of 10^-4. Around 2010, surface codes based on topological ideas were proposed, with a threshold of about 1%, an improvement of two orders of magnitude. With this sharp rise in the fault-tolerance threshold, the prospect of practical quantum computing suddenly opened up, and the field drew growing attention from industry. Of course, this comes at a price: surface-code error correction consumes far more physical resources, roughly 1,000 physical qubits for each fault-tolerant logical qubit. It is estimated that a practical, error-corrected quantum computer that surpasses classical computing would need about 1,000 logical qubits, and hence on the order of a million physical qubits. That is an enormous engineering challenge! In recent years, to better match error-correction theory to experimental capability, many new ideas have been proposed, such as quantum error mitigation.
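
To make these orders of magnitude concrete, here is a minimal arithmetic sketch in Python. The per-logical-qubit overhead and qubit counts are the rough figures quoted above; the exponential error-suppression formula is a commonly quoted surface-code rule of thumb rather than anything from the book, and the physical error rate, threshold, and code distance in it are purely illustrative choices.

```python
# Back-of-the-envelope surface-code resource estimate, using the
# order-of-magnitude figures quoted in the text above.
physical_per_logical = 1_000   # physical qubits per fault-tolerant logical qubit
logical_needed = 1_000         # logical qubits for a practical machine
print(f"{physical_per_logical * logical_needed:,} physical qubits")  # 1,000,000

# A common rule of thumb (illustrative values, not from the text): below
# threshold, the logical error rate falls roughly as (p/p_th)^((d+1)/2),
# where d is the code distance.
p, p_th, d = 1e-3, 1e-2, 17    # assumed physical error rate, threshold, distance
print(f"logical error per cycle ~ {(p / p_th) ** ((d + 1) // 2):.0e}")  # ~1e-09
```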

On the other hand, the technology for controlling quantum systems has also progressed markedly. When Nielsen & Chuang wrote the book, the most promising quantum information processing systems were single photons, trapped ions, and nuclear magnetic resonance. In recent years, optical quantum information processing has developed rapidly: quantum cryptography has gradually matured and become practical, and quantum network technology based on quantum teleportation is booming. Photonic quantum computing has also made great strides. In 2020, a research group at the University of Science and Technology of China demonstrated quantum computational advantage with a photonic system. Meanwhile, the combination of integrated photonics and optical quantum computing has given people confidence in the prospects for large-scale, programmable optical quantum computers.

After 2010, quantum computing based on superconducting circuit qubits emerged and, alongside trapped-ion and optical quantum computing, became one of the most promising candidate platforms. Quantum computers based on superconducting circuits and on ion traps have both reduced their quantum logic gate error rates below 1%, reaching the surface-code fault-tolerance threshold. Moreover, the number of qubits in both platforms is growing rapidly, currently approaching 100. Superconducting quantum computing also demonstrated quantum computational advantage, in 2019. It is on this basis that large companies such as Google and IBM are confident that fault-tolerant quantum logic gates can be demonstrated for the first time around 2025, and that practical fault-tolerant quantum computing can be achieved after 2030. At present, from industry to academia, people are studying how to use the noisy intermediate-scale quantum computers that have achieved quantum advantage to solve genuinely valuable problems and play a unique role.

In addition, although quantum computing based on silicon semiconductor quantum dots still has relatively few qubits, its quantum logic gate error rate has recently been pushed below the 1% fault-tolerance threshold. Traditional computing companies such as Intel, which hold a dominant position in silicon chip manufacturing, are partial to silicon quantum-dot technology. If this technological path can fully exploit the accumulated expertise of conventional semiconductor microfabrication, its future prospects are promising.

Finally, as technology has developed, it has gradually become clear that nuclear magnetic resonance systems cannot support large-scale quantum information processing: the number of qubits can hardly exceed 10. However, the technique offers high control precision and relatively low cost, and works at room temperature and pressure, so it has become a powerful tool for quantum simulation of small systems and shines in the teaching of quantum information technology.

Yin Zhangqi is one of the translators of the book, and the other translators are Sun Xiaoming, Shang Yun, Li Luzhou, Wei Zhaohui, and Tian Guojing.

The prospect of practical quantum information processing

Written by | Michael A. Nielsen, Isaac L. Chuang

Building quantum information processing devices is a huge challenge for the scientists and engineers of the third millennium. Can we rise to the challenge? Is it possible? Is it worth trying? If so, how might this feat be accomplished? These are difficult but important questions, which we answer briefly in this section and take up in more depth throughout the book.

The most basic question is whether there is any principle that forbids one or more forms of quantum information processing. Two possible obstacles present themselves: noise may pose a fundamental barrier to useful quantum information processing; or quantum mechanics may turn out to be incorrect.

There is no doubt that noise is a major obstacle to the development of practical quantum information processing devices. Is it a fundamentally insurmountable obstacle? Will it forever block the development of large-scale quantum information processing devices? The theory of quantum error-correcting codes strongly suggests that while quantum noise is a practical problem that must be solved, it poses no difficulty of principle. In particular, there is a threshold theorem for quantum computing, which roughly states that if the noise level in a quantum computer can be reduced below a constant "threshold", then quantum error-correcting codes can be used to reduce it further, essentially without limit, at the cost of only a small overhead in computational complexity. The threshold theorem makes some broad assumptions about the nature and magnitude of the noise occurring in a quantum computer and about the architecture available for performing the computation; provided these assumptions are met, however, the effect of noise on quantum information processing can be made essentially negligible. Chapters 8, 10, and 12 discuss quantum noise, quantum error correction, and the threshold theorem in detail.

The second possible obstacle to quantum information processing is that quantum mechanics might be incorrect. Indeed, probing the validity of quantum mechanics (both relativistic and non-relativistic) is one reason for the interest in building quantum information processing devices. We have never before explored a regime of nature in which complete control is achieved over large-scale quantum systems, and perhaps in this regime nature will reveal new surprises that quantum mechanics cannot adequately explain. Should that happen, it would be a major discovery in the history of science and could be expected, like the discovery of quantum mechanics itself, to have significant repercussions in other sciences and technologies. Such a discovery could also affect quantum computing and quantum information; whether it would enhance, diminish, or leave untouched our ability to process quantum information cannot be predicted in advance. Unless and until such effects are found, we have no way of knowing how they would bear on information processing, so in the rest of this book we follow all the evidence to date and assume that quantum mechanics is a correct and complete description of the world.

Given that there appears to be no fundamental obstacle to building quantum information processing devices, why invest great amounts of time and money in doing so? We have already discussed several reasons: practical applications such as quantum cryptography and the factoring of large composite numbers, and the desire to gain fundamental insight into nature and into information processing.

These are all good reasons, and they justify investing substantial time and money in building quantum information processing devices. But to assess their relative merits, we need a clearer understanding of the relative power of quantum and classical information processing, which in turn requires further theoretical work on the foundations of quantum computing and quantum information. Of particular interest is a decisive answer to the question "Are quantum computers more powerful than classical computers?" Even if that question cannot be answered for now, it would be useful to lay out a path of clear and interesting applications at varying levels of complexity, to spur researchers toward the experimental realization of quantum information processing. Historically, technological progress has often been accelerated by using short- and medium-term incentives as stepping stones toward long-term goals. Microprocessors, for example, were first used as controllers for elevators and other simple devices before eventually becoming the basic components of personal computers (a device no one had yet imagined). Below we sketch a path of short- and medium-term targets for those interested in the long-term goal of large-scale quantum information processing.

1

Surprisingly, many small-scale applications of quantum computing and quantum information are already known. Not all of them are as flashy as the quantum factoring algorithm, but small-scale applications are comparatively easy to implement, which makes them important medium-term goals.

Quantum state tomography and quantum process tomography are two fundamental procedures whose refinement matters for quantum computing and quantum information, and which also have independent value. Quantum state tomography is a method for determining the quantum state of a system. To do so it must overcome the "hidden" nature of quantum states (recall that a quantum state cannot be determined by a single measurement) by repeatedly preparing the same state and then measuring it in different ways, building up a complete description of the state. Quantum process tomography is a more ambitious (but closely related) procedure that aims to characterize the dynamics of a quantum system completely. For example, it can be used to characterize the performance of a putative quantum gate or quantum communication channel, or to determine the types and magnitudes of the different noise processes in a system. Beyond its obvious applications in quantum computing and quantum information, quantum process tomography can be expected to serve, as a diagnostic tool, in evaluating and improving the basic operations in any field of science and technology where quantum effects are important. Quantum state tomography and quantum process tomography are described in more detail in Chapter 8.
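
For readers who want to see the idea in miniature: a single qubit's density matrix is fully determined by its three Pauli expectation values, so state tomography amounts to estimating those averages from repeated measurements and recombining them. The Python sketch below assumes ideal, noise-free expectation values; a real experiment would replace them with finite-sample estimates.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct_qubit(ex, ey, ez):
    """Single-qubit state tomography: rho = (I + <X>X + <Y>Y + <Z>Z) / 2."""
    return (I + ex * X + ey * Y + ez * Z) / 2

# Example: the measured averages of the |+> state are <X>=1, <Y>=<Z>=0.
print(reconstruct_qubit(1.0, 0.0, 0.0))   # [[0.5 0.5], [0.5 0.5]]
```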

Various small-scale communication primitives are also of great interest. We have already mentioned quantum cryptography and quantum teleportation. The former is likely to be useful in practical applications involving the distribution of small amounts of key material that must be kept highly secure. The uses of quantum teleportation are less settled. As we will see in Chapter 12, in the presence of noise teleportation can be an extremely useful primitive for moving quantum states between distant nodes of a network. The idea is to focus on distributing EPR pairs between the nodes that wish to communicate. EPR pairs may be corrupted in transit, but special "entanglement distillation" protocols can then be used to "purify" them, enabling the purified pairs to teleport quantum states from one location to another. Indeed, protocols based on entanglement distillation and teleportation turn out to outperform more conventional quantum error-correction techniques for the noise-free communication of qubits.
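
Since teleportation recurs throughout the book, a tiny simulation may help fix the protocol in mind. The sketch below follows the standard textbook circuit (a Bell-basis measurement by Alice, then a Pauli correction by Bob) on ideal state vectors; noise and the distillation step discussed above are deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Qubit 0 holds the unknown state; qubits 1 and 2 share an EPR pair (Bob holds 2).
psi = np.array([0.6, 0.8j])                          # the state to teleport
epr = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, epr)                            # 3-qubit state, 8 amplitudes

# Alice: CNOT from qubit 0 onto qubit 1, then Hadamard on qubit 0
# (this rotates her pair into the Bell basis).
state = np.kron(CNOT, I2) @ state
state = np.kron(H, np.eye(4, dtype=complex)) @ state

# Alice measures qubits 0 and 1; Bob's qubit collapses accordingly.
amps = state.reshape(4, 2)                           # rows: outcomes of qubits 0,1
probs = np.sum(np.abs(amps) ** 2, axis=1)
m = rng.choice(4, p=probs)
bob = amps[m] / np.sqrt(probs[m])

# Bob applies the Pauli correction dictated by Alice's two classical bits.
m0, m1 = m >> 1, m & 1
bob = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ bob
print(abs(np.vdot(psi, bob)) ** 2)                   # fidelity with psi -> 1.0
```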

What about medium-scale applications? One promising medium-scale application of quantum information processing is the simulation of quantum systems.

Simulating a quantum system containing even a few dozen "qubits" (or an equivalent amount of some other basic system) strains the resources of even the largest supercomputers. A simple calculation is instructive. Suppose we have a system of 50 qubits. Describing its state requires 2^50 ≈ 10^15 complex amplitudes. If the amplitudes are stored to 128-bit precision, each one takes 256 bits, or 32 bytes, for a total of 32×10^15 bytes of information, roughly 32,000 terabytes, far beyond the capacity of existing computers; assuming Moore's law were to continue to hold, it corresponds to the storage expected of supercomputers in the second decade of the twenty-first century. Ninety qubits at the same precision would require 32×10^27 bytes, which amounts to kilograms (or more) of matter even if a single atom were used to represent each bit.
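
The numbers in this paragraph follow directly from the rule that an n-qubit state has 2^n complex amplitudes; the short Python check below reproduces them, using the same 32 bytes per amplitude (two 128-bit reals per complex number) assumed in the text.

```python
# Memory needed to store an n-qubit state vector at 32 bytes per amplitude.
for n in (50, 90):
    total_bytes = (2 ** n) * 32
    print(f"{n} qubits: {total_bytes:.1e} bytes")
# 50 qubits: 3.6e+16 bytes (the text rounds 2^50 to 10^15, giving 32,000 TB)
# 90 qubits: 4.0e+28 bytes
```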

How useful will quantum simulation be? It seems likely that conventional methods will still be used to determine elementary properties of materials, such as bond strengths and basic spectral properties. But once the elementary properties are well understood, quantum simulation is likely to become valuable as a laboratory tool for designing and testing the properties of new molecules. In a conventional laboratory, testing the various possible designs for a molecule may require many different types of "hardware": chemicals, detectors, and so on. On a quantum computer these different types of hardware can all be simulated in software, which is likely to be much cheaper and faster. Of course, final design and testing must still be performed on real physical systems; but a quantum computer could make it possible to explore a far larger range of potential designs and to single out better candidates for final testing. Notably, this approach of ab initio calculation to aid the design of new molecules has been attempted on classical computers; because of the enormous computational resources needed to simulate quantum mechanics classically, however, it has met with only limited success. Quantum computers should be able to do much better in the relatively near future.

What of large-scale applications? Beyond extensions of quantum simulation and quantum cryptography, relatively few large-scale applications are known: factoring large numbers, computing discrete logarithms, and quantum search. Interest in the first two problems stems mainly from their negative implication that existing public-key cryptosystems may be rendered insecure. (They may also hold substantial intrinsic interest for mathematicians who study these problems for their own sake.) It therefore seems unlikely that factoring and discrete logarithms will remain important applications in the long run. Quantum search, by contrast, may prove enormously useful because heuristic search is so widely employed; we discuss some possible applications in Chapter 6. The truly extraordinary large-scale applications of quantum information processing may be ones not yet imagined. Finding them is a great goal for the future!
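
The advantage of quantum search is a quadratic reduction in the number of queries: an unstructured search over N items takes about N/2 queries classically on average, but only about (pi/4)*sqrt(N) Grover iterations. A two-line comparison, with N = 10^6 as an arbitrary example:

```python
import math

N = 10 ** 6                                              # unstructured search space
print(f"classical, average: {N / 2:.0f} queries")        # 500000
print(f"Grover iterations : {math.pi / 4 * math.sqrt(N):.0f}")  # ~785
```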

2

Assuming there are worthwhile applications of quantum information processing, how can it be realized in actual physical systems? At the small scale of a few qubits, several working implementations of quantum information processing devices already exist. Perhaps the easiest to realize are based on optical techniques, that is, on electromagnetic radiation. Simple devices such as mirrors and beamsplitters can be used to perform elementary operations on photons. Interestingly, a major difficulty has been producing single photons on demand; experimentalists have instead opted for schemes that generate single photons randomly, "every now and then", and wait for such an event to occur. Quantum cryptography, superdense coding, and quantum teleportation have all been realized with such optical techniques. A major advantage of optical techniques is that photons tend to be highly stable carriers of quantum mechanical information. A major disadvantage is that photons do not interact directly with one another; the interaction must instead be mediated by something else, such as an atom, which introduces additional noise and complexity into the experiment. An effective interaction between two photons is established essentially in two steps: the first photon interacts with the atom, which in turn interacts with the second photon, producing an overall interaction between the two photons.

A different class of schemes is based on methods for trapping atoms: the ion trap, in which a small number of charged atoms are confined in space, and the neutral atom trap, which confines uncharged atoms. In these schemes, qubits are stored in the atoms themselves. Electromagnetic radiation appears here too, but in a very different way from what we called the "optical" approach to quantum information processing: the photons are used to manipulate the information stored in the atoms, not as the carriers of the stored information. Single-qubit gates can be performed by applying appropriate pulses of electromagnetic radiation to individual atoms. Neighboring atoms can implement quantum gates through, for example, dipole interactions. Moreover, the exact nature of the interaction between neighboring atoms can be modified by applying suitable pulses of electromagnetic radiation, giving the experimenter control over which gates the system performs. Finally, quantum measurement can be accomplished in these systems using the well-established quantum jump technique, which implements measurement in the computational basis with superb accuracy.
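
The single-qubit gates mentioned here have a simple closed form: a resonant pulse of area theta and phase phi rotates the qubit about an axis in the x-y plane. The sketch below writes down this generic textbook unitary (it is not tied to any particular trap experiment) and checks that a pi pulse flips the qubit.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])

def pulse(theta, phi):
    """Unitary for a resonant pulse of area theta and phase phi:
    a rotation about the axis cos(phi) X + sin(phi) Y."""
    axis = np.cos(phi) * X + np.sin(phi) * Y
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * axis

# A pi pulse about x takes |0> to |1>, up to a global phase: result ~ [0, -1j].
print(np.round(pulse(np.pi, 0.0) @ np.array([1, 0], dtype=complex), 3))
```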

Yet another class of quantum information processing schemes is based on nuclear magnetic resonance, often known by its acronym NMR. These schemes store quantum information in the nuclear spins of the atoms in a molecule and manipulate that information using electromagnetic radiation. Such schemes pose special difficulties, because in NMR it is not possible to address a single nucleus directly. Instead, a huge number (typically around 10^15) of essentially identical molecules are held in solution. Electromagnetic pulses are applied to the sample, causing each molecule to respond in roughly the same way. You should think of each molecule as an independent computer, and the sample as containing a vast number of (classically) parallel computers. NMR quantum information processing faces three special difficulties that set it apart from other schemes. First, the molecules are usually prepared by letting them equilibrate at room temperature, where the thermal energy is so much greater than a typical spin-flip energy that the spins become almost completely randomly oriented. This makes the initial state far "noisier" than quantum information processing requires; how this noise is overcome is an interesting story that we tell in Chapter 7. Second, the class of measurements that can be performed in NMR falls far short of the most general measurements we would like to use in quantum information processing. Nevertheless, for many instances of quantum information processing the measurements allowed in NMR suffice. Third, because molecules cannot be addressed individually in NMR, one might wonder how individual qubits can be manipulated appropriately. Fortunately, different nuclei in a molecule can have distinct properties that allow them to be addressed separately, or at least addressed at a scale fine-grained enough to permit the operations required for quantum computing.
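
The first difficulty can be quantified in a couple of lines. At room temperature the thermal energy k_B*T dwarfs the nuclear spin-flip energy, so the equilibrium polarization (the fractional excess of spins aligned with the field) is tiny. The sketch below assumes a proton Larmor frequency of 500 MHz, typical of an 11.7 T NMR magnet; that frequency is an illustrative choice, not a figure from the text.

```python
import numpy as np

hbar = 1.0546e-34   # reduced Planck constant, J*s
k_B = 1.381e-23     # Boltzmann constant, J/K
T = 300.0           # room temperature, K
f = 500e6           # assumed proton Larmor frequency (~11.7 T magnet), Hz

spin_flip = hbar * 2 * np.pi * f     # energy of one nuclear spin flip
print(f"thermal / spin-flip energy ~ {k_B * T / spin_flip:.0f}")        # ~12500
print(f"equilibrium polarization ~ {np.tanh(spin_flip / (2 * k_B * T)):.1e}")  # ~4e-05
```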

Many of the elements needed for large-scale quantum information processing can be found in existing schemes: superb state preparation and quantum measurement on a small number of qubits in ion traps; excellent dynamical evolution in small molecules with NMR; and fabrication techniques in solid-state systems that allow designs to be scaled up enormously. A single system possessing all these elements would go a long way down the road to the dream quantum computer. Unfortunately, these systems are very different, and we are many, many years from having large quantum computers. Nevertheless, we believe that the existence of all these properties in existing (albeit different) systems bodes well for the eventual existence of large-scale quantum information processors. Furthermore, it suggests the merit of pursuing hybrid designs that combine the best features of two or more existing technologies. For example, much work has been done on trapping atoms inside electromagnetic cavities, which makes it possible to manipulate the atoms flexibly inside the cavity by optical techniques, and enables real-time feedback control of single atoms in ways impossible in conventional atom traps.

Finally, one should take care not to regard quantum information processing as merely another information processing technology. It is tempting, for example, to dismiss quantum computing as just one more technological fad in the evolution of computers that, like other fads, will fade in time; "bubble memory", for instance, was widely touted as the next generation of storage technology in the early 1980s. That would be a mistake, because quantum computing is an abstract paradigm for information processing that may admit many different technological implementations. One can compare the technical merits of two different schemes for quantum computing (it makes sense to speak of a "good" scheme versus a "bad" one); but even a very poor implementation of a quantum computer is qualitatively different in nature from a superbly engineered classical computer.

This article is excerpted from Chapter 1 of Quantum Computation and Quantum Information (10th Anniversary Edition, Publishing House of Electronics Industry, February 2022); the section numbers in the text were added by "Return to Simplicity" (Fanpu).

