The French government recently announced a major “Quantum Plan” of 1.8 billion euros, most of which will be devoted to quantum computers. French research thus intends to claim a prominent place in this formidable scientific and technological adventure, which many compare to the beginnings of microelectronics in the last century. It must be said that the forces involved are numerous and diverse, as the graph shows.

The fundamental reason for this rush to quantum computing is the phenomenon of quantum superposition: a particle can be in several quantum states at the same time.

Applied to computing, superposition naturally yields exponential computing power. A conventional register of N bits can store only one string of N binary values at a time. But a register of N qubits can, thanks to superposition, hold all the combinations at once, i.e. 2^{N} strings of binary values, each associated with a certain probability. *“The art of algorithms consists in modifying the distribution of these probabilities to converge towards the correct answer, by relying on a cascade of logic gates and the phenomenon of quantum entanglement”*, explains François Perruchot, quantum engineering expert at CEA-Leti. The goal is that when the register is read (an action that makes the superposition vanish), only the good results appear.
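The register described above can be sketched with a toy statevector simulation in numpy (the gate matrices and register size here are illustrative choices, not anything from the article): applying a Hadamard gate to each of N qubits puts the register into a uniform superposition of all 2^N bit strings.

```python
import numpy as np

# Toy statevector simulation: an N-qubit register holds 2**N complex
# amplitudes. Applying a Hadamard gate to every qubit turns |00...0>
# into a uniform superposition over all 2**N bit strings at once.
N = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = np.zeros(2**N)
state[0] = 1.0  # start in |000>

# Build H (x) H (x) H and apply it to the whole register.
op = H
for _ in range(N - 1):
    op = np.kron(op, H)
state = op @ state

probs = state**2  # measurement probabilities (amplitudes are real here)
print(len(state))          # 8 amplitudes for 3 qubits
print(np.round(probs, 3))  # each of the 8 outcomes has probability 1/8
```

Reading the register corresponds to sampling one bit string from `probs`; a quantum algorithm's job, as Perruchot says, is to reshape that distribution before the readout.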

A first concrete application appeared with the discovery of Shor’s algorithm in 1994, which allows large numbers to be factored much more quickly. On a classical computer, the cost of such an operation grows dramatically with the number of bits: if N is the number of bits, brute-force approaches require on the order of 2^{N} elementary operations, and even the most sophisticated known classical algorithms remain superpolynomial. With Shor’s algorithm, the number of operations is only of the order of N^{3}. The speedup is thus said to be exponential, or “superpolynomial”.
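The gap between these two growth rates can be made concrete with a few lines of arithmetic (the 2^N figure is the article's brute-force order of magnitude, used here purely for illustration):

```python
import math

# Compare the brute-force classical cost (~2**N, the order of magnitude
# cited in the text) with Shor's ~N**3 for an N-bit number.
for n in (32, 64, 128, 256):
    print(f"N={n:4d}  2^N ~ 10^{n * math.log10(2):.0f}  N^3 = {n**3}")
```

Already at N = 256 the classical figure is an astronomical ~10^77 operations, while N^3 is under 17 million.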

### The “killer app” is chemistry

This discovery is of great interest to intelligence agencies, because it would make it possible to break public-key encryption such as the RSA algorithm, which protects all our communications. The best that has been achieved classically to date is the factoring of an 829-bit RSA key, which required the equivalent of 2,700 Intel Xeon compute cores running for a year. Keys of at least 2,048 bits are recommended today, and these are impossible to break with current supercomputers. But in 2003, the researcher Stephane Beauregard showed that factoring an N-bit integer requires only 2N + 3 logical qubits, roughly twice the key length. Breaking a 2,048-bit key would therefore take 4,099 logical qubits.
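Beauregard's bound is simple enough to check directly; the small helper below just encodes the 2N + 3 formula quoted in the text:

```python
# Beauregard's bound as quoted in the text: factoring an N-bit integer
# needs about 2N + 3 logical qubits.
def logical_qubits(bits: int) -> int:
    return 2 * bits + 3

print(logical_qubits(829))   # 1661, for the largest RSA key factored classically
print(logical_qubits(2048))  # 4099, matching the figure in the text
```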

But espionage, let’s be honest, doesn’t make people dream, nor does it explain the industrial craze we are seeing around quantum computing. The field that stands to benefit most is chemistry and materials science. This is what the physicist Richard Feynman had in mind when he first imagined a quantum computer in 1982: he believed that only a quantum system would be able to simulate physical quantum systems.

Indeed, simulating molecules and chemical reactions requires taking into account, for example, electron spins. These can only take two values (+1/2 and -1/2), so the complexity grows exponentially with the number of electrons (2^{N}). According to scientists, simulating a relatively simple molecule like penicillin (41 atoms, 242 electron orbitals) would require a classical computer with 10^{86} transistors, more than the number of atoms in the visible universe. With a quantum computer, by contrast, 286 logical qubits would in theory suffice.
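A quick consistency check on these orders of magnitude (this back-of-the-envelope arithmetic is mine, not the article's): a register of 286 qubits spans a state space of dimension 2^286, which is indeed about 10^86.

```python
import math

# 242 two-valued spin orbitals already give an enormous configuration space:
orbitals = 242
print(f"2^{orbitals} ~ 10^{orbitals * math.log10(2):.0f}")  # about 10^73 configurations

# And 286 qubits span a Hilbert space of dimension 2**286, i.e. roughly
# the 10**86 figure cited for the classical transistor count:
print(f"2^286 ~ 10^{286 * math.log10(2):.0f}")  # about 10^86
```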

More interesting than penicillin is the FeMoco molecule, which is involved in the biological fixation of atmospheric nitrogen into ammonia by bacteria. It is a fundamental reaction of nature, yet still very largely misunderstood. The chemical industry has its own techniques for fixing nitrogen, but they are much less efficient than this biological process. In 2016, a group of researchers showed that this molecule could be simulated in around twenty hours with 2,024 logical qubits, using the Trotter-Suzuki algorithm. And it is not the only approach. In this field of quantum simulation, *“We have very efficient quantum algorithms and we are almost certain that there is no such efficient classical algorithm”*, highlighted Andrew Childs, professor at the University of Maryland, at the Q2C 2020 conference.

### Algorithms in spades

An exponential quantum speedup has also been proven, since 2009, for solving systems of linear equations and, by extension, differential equations, via the HHL algorithm (Harrow, Hassidim, Lloyd). The field of application of this type of algorithm is obviously very wide: industry, commerce, machine learning, etc. And in reality, there are many other quantum algorithms known to be more efficient than their classical counterparts. A non-exhaustive but rather impressive list is available on the Quantum Algorithm Zoo website.
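To see what HHL targets, here is the classical version of the problem (the matrix and right-hand side are arbitrary illustrative values): solving Ax = b costs roughly O(n^3) operations classically, whereas HHL prepares a quantum state proportional to the solution in time polylogarithmic in n, for sparse, well-conditioned matrices, and only if the data can be loaded efficiently.

```python
import numpy as np

# Classical baseline for the problem HHL addresses: solve Ax = b.
# Gaussian elimination costs about O(n**3) for an n x n system; HHL
# instead encodes the solution in the amplitudes of a quantum state.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)
print(x)  # [2. 3.]
```

Note that HHL's output is a quantum state, not a printed vector: reading out all n components would erase the speedup, which is one reason its practical scope is debated.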

Among them, Grover’s algorithm deserves particular mention. Discovered in 1996, it searches for an element in an unstructured set (the needle in a haystack, in a way). In classical computing, the complexity of this task is of the order of N, the number of elements. A quantum computer can reduce this complexity to √N, a polynomial (here quadratic) speedup. *“It’s much less impressive than Shor’s algorithm, but it’s already very surprising that we can achieve such a result in this area. Moreover, this is a problem that exists in many applications”*, explains Andrew Childs.
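Grover's algorithm is small enough to simulate exactly with a statevector (the set size N = 8 and marked index are arbitrary choices for illustration): roughly (π/4)·√N rounds of "oracle plus inversion about the mean" concentrate the probability on the marked item.

```python
import numpy as np

# Statevector sketch of Grover's algorithm on N = 8 entries (3 qubits),
# searching for the marked index 5.
N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all indices

iterations = int(round(np.pi / 4 * np.sqrt(N)))  # 2 iterations for N = 8
for _ in range(iterations):
    state[marked] *= -1               # oracle: flip the marked amplitude's sign
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

probs = state**2
print(int(np.argmax(probs)))             # 5: the marked item now dominates
print(round(float(probs[marked]), 3))    # its probability is above 0.9
```

After only two iterations the marked item is measured with probability above 94%, versus 1/8 for a single random classical probe.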

### An ideal beyond reach

The problem is that none of these algorithms is usable right now, because quantum computers are not yet powerful enough. The error rate is still far too high and the superposition phase far too unstable. As a result, obtaining the equivalent of one logical qubit requires combining hundreds or even thousands of physical qubits to absorb all these imperfections. In 2019, researchers showed that a quantum computer could break a 2,048-bit RSA key in eight hours … provided it had 20 million physical qubits. The same year, other researchers estimated that the FeMoco molecule could be simulated within a week with the help of … a million physical qubits. As for solving linear systems, it requires loading a large amount of data upfront, which engineers do not yet know how to do.
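Dividing the two figures in the text gives a feel for the error-correction overhead (this ratio is my back-of-the-envelope estimate, not a number from the 2019 paper, whose accounting is more subtle):

```python
# Rough overhead implied by the figures above: 20 million physical
# qubits to break a 2048-bit RSA key, versus the ~4099 logical qubits
# of Beauregard's bound.
physical = 20_000_000
logical = 2 * 2048 + 3  # 4099 logical qubits
print(physical // logical)  # 4879: thousands of physical qubits per logical one
```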

At present, no one really knows when we will have such machines; we don’t even know whether they will ever be possible. However, researchers do not want to stand idly by, and are trying to implement degraded versions of these algorithms on the architectures available or under development, imperfect as they are. In short, until we have proper fault-tolerant quantum technology, we make do with what we have. This is what specialists call the NISQ era (Noisy Intermediate-Scale Quantum). For the simulation of molecules, there is for example the VQE algorithm (Variational Quantum Eigensolver). For optimization problems, there is the QAOA algorithm (Quantum Approximate Optimization Algorithm).
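The VQE idea can be illustrated on a single qubit (the Hamiltonian below is an arbitrary toy example, and the grid search stands in for the classical optimizer): a parametrized trial state |ψ(θ)⟩ is tuned so that the energy ⟨ψ(θ)|H|ψ(θ)⟩ approaches the true ground-state energy. On real NISQ hardware the expectation value would come from repeated noisy measurements rather than matrix algebra.

```python
import numpy as np

# Toy VQE: minimize the energy <psi(theta)|H|psi(theta)> of a 2x2
# Hamiltonian over trial states |psi(theta)> = Ry(theta)|0>.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])  # arbitrary toy Hamiltonian

def energy(theta: float) -> float:
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # Ry(theta)|0>
    return float(psi @ H @ psi)

# Stand-in for the classical optimization loop: a simple grid search.
thetas = np.linspace(0, 2 * np.pi, 1000)
best = min(thetas, key=energy)

print(round(energy(best), 3))              # variational estimate: -1.118
print(round(np.linalg.eigvalsh(H)[0], 3))  # exact ground-state energy: -1.118
```

For this tiny problem the variational minimum matches the exact ground-state energy; the open question for NISQ machines is whether the same loop beats classical methods on problems too large to diagonalize.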

The idea is excellent, but its outcome is very uncertain. *“It is very difficult to prove [mathematically] a speedup in these cases. So in fact, we don’t really know”*, explains Phillip Gerbert, Managing Director at Boston Consulting Group, at the Q2B 2019 conference. Scientists and engineers therefore proceed heuristically, that is to say by experiment. That is why Google and other manufacturers are partnering with companies that could potentially use quantum computers: they can thus test their algorithms on real data and adapt them accordingly. *“If we find effective NISQ algorithms, it will probably be for very specific problems”*, said Ryan Babbush, researcher at Google Research, also at Q2B 2019.

For some, this situation is reminiscent of artificial intelligence, a field where it has likewise always been very difficult to prove anything mathematically in advance, and where the tremendous progress of recent years was achieved gradually, through experience and testing on real data. Obviously, the protagonists of quantum computing hope to avoid a crossing of the desert like the one artificial intelligence endured in the 1980s and 1990s, when development completely stagnated. But nothing is less certain.

**Source:** Science, recherche – 01net **by** www.01net.com.
