Inevitable rise or distant dream?

The smaller computer parts get, the more efficient they become – that has long been the rule of thumb for computers and electronics in general.

But these parts have reached a point where they can no longer shrink without losing the properties that machines like modern computers depend on, and that limit has become a barrier to technological advancement.

Computers, and technological advances in general, are reaching this physical limit as processors, transistors, and other computer parts approach the size of an atom.

Modern electronics use silicon-based transistors as small as 10nm, several hundred times smaller than a human red blood cell.

At even smaller scales, transistors run up against the limits of classical mechanics: quantum effects such as electron tunneling begin to dominate, and the predictable on/off behavior on which modern computers are built is no longer maintained. This is where quantum mechanics comes into play.

For the uninitiated, quantum mechanics is the study of matter and energy at the atomic and subatomic scale – particles such as electrons, neutrons, and protons. In contrast to the physical objects that surround us, particles at this scale behave very differently.

While bits, or binary digits, are the building blocks of classical computing, quantum computing uses qubits. Bits in classical computing can be either 0 or 1, basically an “on” or “off” switch for the transistor to either pass or block electrons. Qubits, on the other hand, can exist in a combination of 0 and 1 at the same time.

Imagine a glass of lemonade where the lemon juice is 1 and the water is 0. The glass of lemonade is a mixture of lemon juice and water, and until the mixture is analyzed in a lab, there’s no way of telling what the ratio is.

Qubits are like that. 1 and 0 both exist in some ratio in a qubit, and as with lab testing, they collapse to a stable state of either 1 or 0 only when the qubit is observed or measured, giving us an unequivocal result. This uncertainty of state is called quantum superposition.
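Superposition and measurement can be sketched in a few lines of code. This is a toy classical simulation, not a real quantum device: a qubit's state is written as two amplitudes whose squares give the probabilities of reading 0 or 1, and each "measurement" collapses it to a definite value.

```python
import math
import random

# A qubit's state can be written as two amplitudes (a, b) with a^2 + b^2 = 1
# (real amplitudes are enough for this sketch). Measuring yields 0 with
# probability a^2 and 1 with probability b^2, collapsing the superposition.
a = b = 1 / math.sqrt(2)  # equal superposition of 0 and 1

def measure(a: float, rng: random.Random) -> int:
    """Collapse the qubit to a definite 0 or 1 according to its amplitudes."""
    return 0 if rng.random() < a * a else 1

rng = random.Random(42)
samples = [measure(a, rng) for _ in range(10_000)]
# Each individual result is an unequivocal 0 or 1; only the statistics
# over many measurements reveal the 50/50 superposition underneath.
print(sum(samples) / len(samples))  # close to 0.5
```

Note that no single measurement ever returns "half a bit" – the superposition shows up only in the distribution of outcomes across many runs.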

Aside from this uncertain state, qubits can also become entangled with one another. When two qubits are entangled, their states are correlated: measuring one qubit and seeing it collapse to 1 or 0 immediately tells us something about what the other will read, regardless of the distance between them. This property is known as quantum entanglement.
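The correlation of entangled measurements can also be mimicked classically for intuition. This hypothetical sketch models the simplest entangled state (a Bell pair, where the two qubits always agree): a shared random outcome stands in for the joint collapse.

```python
import random

# Toy model of a Bell pair (the state |00> + |11>, unnormalised):
# the pair collapses as a whole, so both qubits always read the same value.
def measure_bell_pair(rng: random.Random) -> tuple[int, int]:
    outcome = rng.choice([0, 1])  # 50/50 between the "00" and "11" branches
    return outcome, outcome       # both qubits collapse to the same value

rng = random.Random(0)
results = [measure_bell_pair(rng) for _ in range(1_000)]
print(all(q1 == q2 for q1, q2 in results))  # True: the readings always match
```

Each qubit on its own looks perfectly random, yet the pair is perfectly correlated – that joint behavior, with no hidden shared variable in the real quantum case, is what makes entanglement remarkable.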

Quantum computers are built based on these two basic principles of quantum mechanics: superposition and entanglement.

The Nobel Prize-winning American physicist Richard Feynman observed in a famous 1981 lecture that classical computers cannot scale to handle complicated simulations, especially quantum ones. He proposed that these principles of quantum mechanics could be used to build a much better and more efficient computing system.

In 1986, Feynman introduced an early version of quantum circuit notation, and in 1994 Peter Shor developed a landmark quantum algorithm for factoring numbers. Later, in 1998, Isaac Chuang, Neil Gershenfeld, and Mark Kubinec built the world’s first known working quantum computer using just two qubits. Although it was a very early rendition of a primitive computing device, it was quite a leap in the advancement of this nascent technology.

Quantum computers are computing devices that control the behavior of particles at the quantum level. For certain classes of problems, they can be dramatically faster than classical machines while using only a fraction of the computational resources that conventional approaches require.

However, contrary to how they are portrayed in the sci-fi genre, quantum computers are not an upgrade of the classic computers we have in our homes. That’s because they work very differently than the computers we have now. They are also far better at certain complex calculations than the supercomputers that technology companies like Google, IBM and Microsoft use for their R&D.

Comparing classical computers and supercomputers with quantum computers would be like comparing bicycles with motorcycles. Classic computer upgrades usually mean multiplying capacity or efficiency: a decade ago, 1GB of RAM was enough for a PC, while modern computers need several times that – essentially more of the same modules bundled together.

Unlike the RAM in classic computers, no matter how many bicycles are bundled together, they cannot become a motorcycle, because motorcycles are much more efficient and function in a fundamentally different way. The same applies to quantum computers, since they are fundamentally different from conventional computers.

That’s why physicists and researchers behind this technology insist that quantum computers are not an upgrade from supercomputers, but an entirely different superclass of computers that will change the course of computing algorithms for the future.

These computing devices are so advanced that they can solve certain problems in a fraction of the time and energy that even modern supercomputers would need. A simple example is how efficient they can be at searching a database.

For example, if there is a database with 1 trillion names and a search is performed, classical computers and supercomputers compare every single name in the database to the search, which means a trillion operations for just a simple search.

On the other hand, using the properties of qubits, a quantum computer running Grover’s search algorithm can perform the same task in roughly the square root of that many steps. For the same search over 1 trillion names, a quantum computer would only need on the order of 1 million operations – a million times fewer than classical computers or supercomputers would require for the result.
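The arithmetic behind this comparison is simple to check: for an unstructured search over N items, a classical scan needs about N comparisons in the worst case, while Grover’s algorithm needs on the order of √N iterations.

```python
import math

# Rough operation counts for an unstructured search over N items:
# a classical scan needs about N comparisons in the worst case,
# while Grover's quantum search needs on the order of sqrt(N) iterations.
N = 10**12  # 1 trillion names

classical_ops = N               # check every entry, one by one
grover_ops = math.isqrt(N)      # ~sqrt(N) Grover iterations

print(classical_ops)                 # 1000000000000
print(grover_ops)                    # 1000000
print(classical_ops // grover_ops)   # 1000000 -> a million-fold reduction
```

Note this is a quadratic speedup, not an exponential one – impressive for huge databases, but far smaller than the advantage quantum computers offer on problems like factoring.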

What supercomputers can do, quantum computers can do with a fraction of the resources. However, progress in this technology has been slow. Although companies like IBM, Google and Microsoft have invested heavily in the development of quantum computing tools in recent years, we are nowhere near a full-fledged prototype for commercial or personal use.

News of prototypes from Chinese and American researchers breaks out every few years. Still, the closest we have come to a milestone was in October 2019, when Google AI, in partnership with NASA, claimed to have performed a calculation in about 200 seconds that they estimated would take the fastest classical supercomputer thousands of years. Even this claim, however, has been disputed, notably by IBM.

Of course, the commercial and private use of quantum computing is a dream for the distant future, especially since exploiting the quantum properties of particles, unlike operating the classical computing components we use, is only possible in a highly controlled environment – often at temperatures near absolute zero. However, in a decade or two, primitive quantum computing tools could feed into research and simulations that give us a closer look at atoms and molecular structures.

This level of intricate insight and computational power would help the medical and nutritional industries better understand molecules and materials. Any industry or field that relies on research and simulation would benefit greatly from this hyper-efficient computing technology, including space exploration, manufacturing, engineering, molecular analysis, cryptography, and chemical engineering.

Cybersecurity and encryption form another sector that quantum computing will revolutionize. Quantum computers running Shor’s algorithm could break much of today’s public-key encryption, while quantum cryptography cuts the other way: thanks to the very measurement sensitivity of qubits, quantum-secured communication channels would make undetected eavesdropping nearly impossible.
