Publication on quantum computing has increased steeply since the early 1990s
Publication is international and significant investments are happening
The top 10 institutions with the highest publication output are located in China, France, Canada, the US, the UK and Singapore
The two most prolific authors are from Chinese and US institutions
Learn more about tracking current R&D priorities while also monitoring influential trends
Are we at the brink of a second quantum revolution?
Quantum computing is a hallmark "moonshot" of a second quantum revolution. Whereas the first revolution enabled lasers and transistors based on rules of quantum mechanics, the second aims to control quantum systems.1,2 Quantum computing uses quantum phenomena to perform computations. Quantum computing is part of the broader area of quantum information technologies or quantum technologies. Quantum technologies seek to understand how quantum phenomena can be used in computing, communication, sensing and metrology to go beyond what classical systems can do.
For example, qubits3 — the quantum version of the classical bit — are not restricted to the pure 0 or 1 of binary digits. Physically, a qubit could be realized using the polarization of a single photon, the spin of an electron, or currents in a superconductor. At the end of a calculation, when the answer is read out, each qubit will come out as either 0 or 1. But during the calculation the qubit represents both 0 and 1 simultaneously. This is called superposition. This core aspect of quantum computing enables quantum computers to outperform classical computers on some problems requiring intense computation. A second crucial aspect is entanglement, a joint superposition of two or more qubits. This allows a quantum computer to explore a much larger computational space than any classical computer can.
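The effect of superposition on measurement can be illustrated with a short numerical sketch — ordinary linear algebra in Python, not real quantum hardware. A qubit's state is a vector of two complex amplitudes, and the Born rule turns amplitudes into outcome probabilities; a two-qubit entangled state uses four amplitudes.

```python
import numpy as np

# Single qubit in equal superposition: amplitudes for |0> and |1>
qubit = np.array([1, 1]) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitude magnitudes
probs = np.abs(qubit) ** 2
print(probs)  # both outcomes roughly equally likely (~0.5 each)

# Two-qubit entangled Bell state (|00> + |11>)/sqrt(2).
# Amplitudes are ordered |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(np.abs(bell) ** 2)  # only 00 and 11 ever observed: outcomes are correlated
```

Reading out either qubit of the Bell state instantly fixes the other, which is the resource that lets a quantum computer work in a state space that grows exponentially with the number of qubits.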
The second quantum revolution is where you’re really using the quantum mechanics to do everything for you.
Ray Simmonds, U.S. National Institute of Standards and Technology1
Quantum computing has made a big splash across the news in the last few years — from Google’s 53-qubit quantum computer “Sycamore” achieving what has been coined quantum supremacy to multibillion-dollar initiatives around the world to develop quantum technologies for computing and beyond. The race is on.
This report is based on research and analysis conducted in Scopus, an expert-curated abstract and citation database, in February 2021.
The search was conducted assuming a limited knowledge of the topic and interest in research activity around possible applications of quantum computing. It began with "quantum comput*" to combine a loose phrase with a wild card (*), which accommodated word variations like computation.
To broaden the search slightly, we added the terms "quantum algorith*", "quantum simulation*", "qubit" and "quantum bit" to arrive at the search used for this research trends report:
(TITLE-ABS-KEY ("quantum comput*") OR TITLE-ABS-KEY ("quantum algorith*") OR TITLE-ABS-KEY ("quantum simulation*")) OR (TITLE-ABS-KEY ("qubit*") OR TITLE-ABS-KEY ("quantum bit*"))
This report provides search results for document titles, abstracts and keywords. Scopus indexes content from 25,000 active titles and 7,000 publishers, as well as patents and conference abstracts. Scopus's enriched data, metrics and analytical tools inform both research and business strategy, driving better decisions and outcomes.
The search will pick up some documents in related areas such as quantum communication, but will also exclude some of the major works in those areas. It could be further refined to include or remove portions of other areas, such as "quantum simulation", "quantum communication", "quantum cryptography" and "quantum sensing". A more extensive search could cover the broader area of quantum technologies, but here the focus is on quantum computing, including quantum simulation.
A brief history of quantum computing
In 1982, the physicist and 1965 Nobel Laureate Richard Feynman discussed a machine that would operate on quantum mechanical principles to simulate the behavior of one quantum system using another quantum system - a quantum simulator.4 In 1985, David Deutsch of Oxford University further advanced the field by proposing a quantum Turing machine (based on the pioneering work of Alan Turing on what constitutes a general computer) and specifying an algorithm designed to run on a quantum computer.5 Beyond the realm of researchers in quantum physics and theoretical computer science, the field really took off in the mid-90s. In 1994, the mathematician Peter Shor proposed an algorithm for a real-world "killer application" of quantum computers: factoring large numbers into their prime factors exponentially faster than is possible with a classical computer.6
Why was Shor's result so important? Much of modern encryption, such as RSA encryption, is based on the idea that it is very difficult to factor a large integer that is the product of two large prime numbers.7 While it is easy to take two primes and multiply them together to form a big number, doing the inverse, going from a large number to its unknown constituent primes, is hard. So hard, in fact, that a classical computer might take a lifetime or more to perform the calculation. However, on a quantum computer, if it can be made to operate with a large number of qubits, finding the prime factors could be done efficiently (mathematically, in polynomial time), putting the basis of much of the modern encryption used on the internet at risk. If that hard problem can be solved, what other hard problems might become much easier? Quite a few, actually, including hacking bitcoins and other cryptocurrencies.
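The asymmetry that Shor's algorithm attacks can be felt with a toy classical sketch. The primes below are arbitrary small examples, vastly smaller than the hundreds-of-digits primes used in real RSA: multiplying them is a single machine operation, while recovering them by brute-force trial division already takes roughly a hundred thousand steps.

```python
# Two smallish primes (illustrative only; real RSA moduli use primes
# with hundreds of digits, far beyond any brute-force search).
p, q = 104729, 1299709
n = p * q  # the easy direction: one multiplication

def trial_factor(n):
    """Return the smallest nontrivial factor of n by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# The hard direction: ~100,000 divisions for this toy example alone,
# with cost growing exponentially in the number of digits of the factor.
print(trial_factor(n) == p)  # True
```

Shor's quantum algorithm would do the hard direction in polynomial time in the number of digits, which is why its existence threatens RSA-style encryption.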
Publication rate on quantum computing has steeply increased
Scopus search results show a steady increase in research output in quantum computing and related areas, really kicking off in 1994. The growth continues steadily, resulting in over 48,000 publications. From 2015 onward, the rate of publication is especially steep.
Concurrent with these faster research developments, achievements in quantum computing, in terms of the number of qubits used for computations, have emerged:
2012 — A group at the University of Bristol factored the number 21 with Shor's algorithm8
2017 — D-Wave Systems announced the first sale of its D-Wave 2000Q quantum computer; it is not a general purpose quantum computer, but can address optimization problems9
2020 — Google accurately simulated the binding of hydrogen chains and isomerization of diazene using Sycamore10
A significant hurdle in quantum computing
Progress in quantum computing and the broader area of quantum technologies has been quite remarkable. However, significant hurdles remain before a general-purpose quantum computer is available to tackle, for instance, really large-scale factorization into prime numbers or demanding computational tasks beyond optimization problems. The challenge is that while a quantum computer performs its computation, the superposition that allows for the computational advantage breaks down, a process called decoherence.11 The more qubits used, the more susceptible to noise the superposition becomes. This is why real-world quantum computers have been so hard to build.
Physical implementations of quantum computers today, such as those using superconducting technology, must operate at close to absolute zero temperature in a highly isolated environment. Related areas like quantum communication (notably quantum cryptography) and quantum sensing do not rely on a large number of qubits to operate. As an interesting twist, while quantum computing may potentially render current encryption schemes insecure, quantum cryptography would be a way to provide fundamentally secure encryption schemes.12
Are there potential solutions to make quantum computers more resistant to disturbances? In classical computing and communication, error-correcting codes are introduced to correct for noise; for instance, if someone cannot hear what you say, you repeat your message. Impressive work has been done to generalize the concept of error-correcting codes into quantum error correction; for an overview, see the now-classical textbook by Michael Nielsen and Isaac Chuang.13 Correcting errors in one qubit, however, requires the use of several additional qubits, and the technological race is still on to scale up quantum computers for large-scale operations. Physicist John Preskill called this the Noisy Intermediate-Scale Quantum (NISQ) era, as physicists and engineers are still working hard to solve the remaining technical challenges.
As quantum computing moves toward real-world applications, it continues to be an area for fundamental new discoveries in physics. In 2012, Serge Haroche and David Wineland were awarded the Nobel Prize in Physics "for ground-breaking experimental methods that enable measuring and manipulation of individual quantum systems".14
Their work has profound implications for quantum information and quantum computing:
"Their ground-breaking methods have enabled this field of research to take the very first steps towards building a new type of super fast computer based on quantum physics. Perhaps the quantum computer will change our everyday lives in this century in the same radical way as the classical computer did in the last century." - NobelPrize.org
Top quantum computing publication types are journal articles and conference papers
Quantum technologies/computing research outputs indexed in Scopus are primarily journal articles. This reflects a long history of basic research in physics, with many well-established journals (more on journals below). The novelty and rapid evolution of the field is reflected in the high proportion of publications emerging from conferences (21%), about twice the overall proportion of conference papers in Scopus (11%). It is notable that the quantum technology community was an early adopter of the arXiv preprint server for sharing its work.
Quantum computing research has an international footprint
The 10 institutions with the highest publication output are located in China, France (via the CNRS laboratories), Canada, the US, the UK and Singapore. The Chinese Academy of Sciences shows particularly high output. China's overall spending on quantum technologies, including quantum computing, is not public information. However, the government's interest in funding quantum technologies is clear from the Five-Year Plan 2016-2020 and a 10 billion USD investment to build the world's largest quantum research facility.15
The majority of author affiliations are academic and governmental research institutes. Private sector enterprises appear much further down the list with companies such as Nippon Telegraph and Telephone (NTT), IBM Thomas Watson Research Center, and Microsoft Research.
Top contributors to quantum computing
The two most prolific authors on the list are Professors Guang-Can Guo and Franco Nori. Prof. Guang-Can Guo, University of Science and Technology of China, Hefei, publishes on the broad topics of quantum information and has many publications in photonic quantum computing.
Physics, optical sciences and engineering journals lead in quantum computing publications
Physics, optical sciences and engineering journals top the list of journals leading in quantum computing publications. The journal with the highest number of publications in the field is Physical Review A. Over the last decades, the journal has undergone topical restructuring, with a change in its stated scope in the 1990s (Atomic, Molecular and Optical Physics) and the explicit inclusion of quantum information in its scope as of 2016. The journal Quantum Information Processing, first published in 2002, is a relatively new contributor to the body of literature.
Do you need to follow developments in quantum computing and the broader area of quantum technologies?
Quantum technologies, not only quantum computing, are shifting towards enabling real-world uses. Quantum cryptography is already seeing some commercial applications. Research and quantum technology applications are fueled by strategic government investment in the US,19 the EU20 and China.15 Industry initiatives include the Pistoia Alliance21 and IBM's Quantum Summit 2020,22 where, among other things, IBM discussed its ambition to build a 1,000-qubit quantum computer by 2023.23
Technology business journalist Kara Swisher once asked Google CEO Sundar Pichai whether it was possible that Google missed the cloud service trend in the early 2000s and allowed Amazon a head start. (Google has invested heavily to catch up.) Pichai's answer was that they had been busy doing other things.
Don't miss out on an important research trend. Bookmark this page for updates and a head start in planning for quantum computing.
Business Intelligence for science and technology
With Scopus, track your current R&D priorities while also monitoring influential trends. This rich abstract and citation database provides comprehensive insights into interdisciplinary research and analytics. Gain scientific knowledge and competitive intelligence as you:
Track research trends: Get alerts of breakthroughs that can impact your business
Analyze output of key contributors: Find collaborators or competitive white space
Explore relevant and trusted research: Access a comprehensive knowledge base
Would you like to learn more about how Scopus can benefit your research and business?
J. Dowling and G. Milburn, "Quantum Technology: The Second Quantum Revolution," Phil. Trans. R. Soc. Lond. A (2003).
NIST. 2018. The second quantum revolution. Accessed September 2020.
The word qubit was coined in 1995 by B. Schumacher and W.K. Wootters; see https://quantumfrontiers.com/2015/06/09/who-named-the-qubit/
Feynman, R. 1982. Simulating physics with computers. Int. J. Theor. Phys. 21: 467.
Deutsch, David (1985). "Quantum theory, the Church-Turing principle and the universal quantum computer." Proceedings of the Royal Society A. 400(1818): 97–117.
Shor, P.W. (1994). "Algorithms for quantum computation: discrete logarithms and factoring". Proceedings 35th Annual Symposium on Foundations of Computer Science. IEEE Comput. Soc. Press: 124–134.
For an explanation of RSA encryption, see https://www.abc.net.au/news/science/2018-01-20/how-prime-numbers-rsa-encryption-works/, accessed Feb. 2021.
Martin-Lopez, E. et al. 2012. Experimental realization of Shor’s quantum factoring algorithm using qubit recycling. Nature Photonics 6: 773.
D-Wave Systems. 2017. Temporal Defense Systems purchases the first D-Wave 2000Q quantum computer. Accessed September 2020.
Google AI Quantum and Collaborators. 2020. Hartree-Fock on a superconducting qubit quantum computer. Science 369: 1084.
W. Zurek, "Decoherence and the Transition from Quantum to Classical," Physics Today 44, 10, 36 (1991); https://doi.org/10.1063/1.881293.
For a brief on quantum cryptography, see for instance ETSI White Paper No. 8, "Quantum Safe Cryptography and Security: An introduction, benefits, enablers and challenges," June 2015, ISBN No. 979-10-92620-03-0.
Michael A. Nielsen and Isaac L. Chuang. Quantum Computation and Quantum Information. Cambridge University Press (2000).
Nobel Prize in Physics 2012, Press Release.
IDQ. 2018. China's growing investment in quantum computing. Accessed September 2020.
Marr, B. 2017. 6 Practical Examples Of How Quantum Computing Will Change Our World. Accessed September 2020.
Drug Discovery Online. 2020. Researchers use NSF Convergence Accelerator to shorten drug discovery timeline. Accessed November 2020.
Alvarez-Rodriguez, U. et al. (2018) Quantum artificial life in an IBM quantum computer. Scientific Reports 8: 14793.
Kratsios, M. and Liddell, C. 2020. The Trump administration is investing 1 billion USD in research institutes to advance industries of the future. Accessed September 2020.
European Commission. 2018. Quantum Technologies Flagship kicks off with first 20 projects. Accessed September 2020.
Pistoia Alliance. Almost one third of life science companies set to begin quantum computing evaluation this year. Accessed September 2020.
Annunziata, A. 2020. IBM Quantum Summit 2020: Exploring the promise of quantum computing for industry. Accessed November 2020.
Adrian Cho. IBM promises 1000-qubit quantum computer — a milestone — by 2023. Science, Sep. 15, 2020.