Is the quantum computer a real assistant or just a trendy technical fake?

There has been a lot of talk about quantum computing over the last several years. Every so often a new headline claims that some laboratory has finally built a real quantum computer. But is it true? And if it is, how can a quantum computer actually help us?


People like talking about strange things, and the quantum computer is one of them. The topic comes up on internet forums, news sites, and social networks.


Since quantum physics emerged as a distinct field in the early 1920s, scientists have kept insisting on its undoubted value for human civilization. Civilization, in turn, has not seen that value quite so clearly (aside from the good mood provided by internet memes about the hard karma of Schrödinger’s cat). But scientists can’t stop. They believe that all we need is a quantum computer. A great deal of money has been allocated to quantum computing research and to building the first computer based on quantum technology; according to rumor, far more than the budget of NASA’s programs. One more thing: the first scientist to create a working, fully featured quantum computer is widely expected to receive a Nobel Prize. This explains why even the smallest piece of news about quantum computing causes a huge public reaction in the media.


But if you think about it, do we really need a quantum computer? For what purposes? Is there so much information in the world that ordinary laptops, workstations, and even powerful server clusters can’t process it?

Information evolution

First of all, note that any computer is just a tool for information processing.


In its general scientific sense, the term “information” means a change of signal exchanged between objects, groups of entities, or systems: living creatures, animals, plants, people, machines, biological cells, elementary particles, and so on. Information is abstract by nature and has to be processed in the form of data. The two concepts, data and information, go hand in hand, which makes the picture clearer.


When evolution gave humanity its great gift, the second signalling system, that is, speech, our ability to exchange information became much richer. To inform our vis-à-vis (sometimes even an imaginary one) we can use acoustic signals (talking, laughing, crying, growling, etc.) or nonverbal signals (facial expressions, gestures, body language). Another way is graphical data: pictures, words composed of characters, numbers. By their nature, Paleolithic cave petroglyphs are no different from the source code of a program; both transmit information to a recipient capable of processing it.


While perceiving the world and their environment, humans also developed intellectual abilities. These abilities demand far more intensive brain work than simply evaluating the information received through the sense organs. The more data surrounds us, the more computing capacity it takes to handle it. It is sometimes claimed that brainpower is unlimited, but it is hard to deny that, in raw processing speed, the human brain lags behind even the primitive calculators of the twentieth century. Yet we want the output the moment the incoming information has been analyzed. That is why, from time immemorial, people have been searching for tools that deliver the results of computation quickly.

From bit to qubit

So the evolution of information led to the evolution of computers. At school we learned the progression of calculating tools from the abacus, Napier’s bones, and slide rules to modern electronic computers built on silicon transistors. Along the way we met binary code, the basis of classical computing, whose basic unit of information, the bit, represents a logical value that can take only one of two values: 1 (yes/true) or 0 (no/false).
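As a tiny illustration of that last point, here is a minimal Python sketch (the number 163 is an arbitrary choice) showing that an ordinary integer is nothing more than a string of such bits, each strictly 0 or 1:

```python
# A classical value is just a sequence of bits, each strictly 0 or 1.
n = 163                          # an arbitrary example number
bits = format(n, "08b")          # "10100011": eight bits, nothing in between
print(bits)

# Reassembling the integer from its bits loses nothing.
assert int(bits, 2) == n
```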


Information is overloading our world, and the trend is for it to keep growing. According to an IDC forecast, in just six years the total volume of data will reach 163 zettabytes (ZB), or 163*10²¹ bytes; remember, too, that each byte contains 8 bits. Just imagine this colossal pool of information all around us: data produced and processed by home computers and cloud servers, industrial automation and IoT devices, netbooks and smartphones, ATMs and global corporate information systems, smartwatches and mining farms. All of them are built on silicon semiconductor chips and calculate in binary code.
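For a rough back-of-the-envelope check of the figures quoted above (variable names are arbitrary, and the decimal SI definition of a zettabyte is assumed), the conversion looks like this:

```python
# Convert the forecast 163 ZB into bytes and bits.
ZB = 10**21                        # one zettabyte in bytes (decimal SI definition)
total_bytes = 163 * ZB
total_bits = total_bytes * 8       # 8 bits per byte

print(f"{total_bytes:.3e} bytes")  # 1.630e+23 bytes
print(f"{total_bits:.3e} bits")    # 1.304e+24 bits
```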


This multibillion-strong army of chips solves not only practical tasks but abstract, theoretical ones too: problems in astronomy, quantum physics, mathematics, and cybernetics. We can’t avoid academic tasks, because they lead to scientific discoveries that can later be put to practical use. So we need to tackle problems such as prototyping synthetic molecules with preset characteristics, creating new cryptographic algorithms with strong, tamper-resistant 512-bit keys, or fitting together genomic combinations to suppress oncogenesis. Genomic sequencing alone is a time-consuming and compute-intensive operation!


Today our computers can perform these kinds of calculations, but they do so slowly, while we want the result right now. Maybe the principle of calculation itself needs to be revised. As noted earlier, in binary code information takes only two values, 0 or 1. That is why classical computers, by the very nature of their silicon chips, use a ‘linear’, one-step-at-a-time approach to computing. When the data pool is massive, this linear principle demands enormous time and computing resources. Of course, we can throw multi-core processors and multi-processor servers at the problem, but that does not remove the linear principle itself.

The Bloch sphere
“Look, I have superposition!” says quantum physics. “And I know how to parallelize computing and cut the time cost…”
Quantum superposition makes it possible to process several data streams at once. Unlike the binary system, a quantum state effectively has three options: 0, 1, and a blend of both 0 and 1 at the same time.

This principle underlies the mathematical model on which quantum computers are being developed. The basic unit of information in quantum computing is the qubit (quantum bit), as opposed to the bit of “classic” computing. A qubit can take the values |0⟩, |1⟩, or the superposition |ψ⟩ = α|0⟩ + β|1⟩, where |α|² + |β|² = 1. Graphically, a qubit can be visualized as a vector from the center of the Bloch sphere to a point on its surface, which is one more difference from the “classic” bit. It is this property that lets the computation split into simultaneous streams.
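To make the notation above concrete, here is a minimal sketch in plain Python with NumPy (not tied to any real quantum hardware or framework) that represents a qubit as a two-component state vector and computes the measurement probabilities implied by the amplitudes α and β:

```python
import numpy as np

# Basis states |0> and |1> as two-component vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition |psi> = alpha|0> + beta|1>, here alpha = beta = 1/sqrt(2).
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# Born rule: measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(p0, p1)   # 0.5 0.5, and p0 + p1 == 1
```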


Practical value — true or false

So a way has been found to process a vast volume of information simultaneously in several streams. To do this, we transform our data into quantum form, process it as needed, transform it back into its usual form, and enjoy a quick and efficient result.


Well, the power of quantum computing really is great. But does it have practical value, or is it just a fancy device? Believe it or not, there are plenty of practical needs a quantum computer could meet, including the following:

  • cryptography, cybernetics, and software development;
  • managing manufacturing operations;
  • stock exchange analytics and forecasting;
  • highly accurate long-term meteorological and seismological forecasts;
  • modeling and simulation of real physical and chemical systems, including building new molecules and calculating the bond strengths between atoms, electrons, and other elementary particles;
  • creating new compounds, materials, and medicines with preset characteristics for chemistry, pharmaceutics, materials science, and engineering;
  • solving complex problems and modeling systems in mathematics, physics, the aerospace industry, and so on.


Tasks of this kind require considering many different interacting variables simultaneously, and with linear processing they would take years or even centuries. A quantum computer can solve them in far less time thanks to the following effect: qubits interact with one another, and N qubits represent 2^N potential values simultaneously, which is why it becomes feasible to find the right result in one go with just N qubits. The most convenient illustration of this effect is factorization, that is, decomposing large integers into prime factors, as sketched below.
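The sketch below is a purely classical, illustrative Python snippet (the trial_division helper is hypothetical example code, not an algorithm from the article): it shows how quickly the 2^N state space grows, and the naive factoring baseline that a quantum factoring algorithm such as Shor’s would be competing against.

```python
# How the quantum state space grows: N qubits span 2**N basis states at once.
for n in (8, 16, 32, 64):
    print(f"{n} qubits -> {2**n:,} simultaneous basis states")

# The classical baseline: naive trial division, whose cost grows roughly
# with the square root of the number being factored.
def trial_division(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(15))   # [3, 5]
```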

Look, the chip is golden!

Intel quantum chip (49 qubits)
Certainly, quantum computing needs a different kind of processor, not a “classic” one. Unlike silicon-based chips, a quantum processor favors gold as a material. Like any “classic” chip, a quantum chip has connectors for exchanging information, for example to upload a program and download its result, and gold was long ago found to be the best connector material thanks to its conductive properties. Cooling to ultra-low temperatures slows the chaotic motion of particles in the materials, so the thermal noise that disturbs the state of the qubits and distorts useful information is kept to a minimum.

The point is that quantum systems need a sufficiently long coherence time to upload information, process it, and output the result before noise and interference corrupt the data inside the qubits, as the rules of the quantum world dictate. In addition, an error-correction scheme has to be developed before a full-fledged quantum computer (with at least 50 to 100 qubits) can be built, and that remains out of reach until the coherence-time problem is solved. The machines we have today are experimental prototypes with 5, 16, or a few more qubits. The D-Wave quantum computer presented to a broad audience in 2007 had 16 qubits, and the IBM Q project is genuinely up and running, giving everyone access to test devices via the cloud.

Quantum: elusive or illusion?

IBM Q interface
Today we haven’t got a full-fledged quantum computer. Some companies suggest to test many prototypes with the quantum “logic” imitation, but in fact, all of them are emulators. Scientists and researchers from different countries institutions try to force big structures to act in a logical sense as elementary particles. Can we consider this emulation true? Tech idea of qubit creation as “artificial” atom in zero temperature (by Calvin scale) use IBM and some other companies. This type of cooling is necessary due to ultra-low temperature, that can provide you with superconduction effect, and the system stability time could rise. We can also learn from media about some researchers attempt to create an optical computer based on photon (quantum of light) and some other approaches.
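As an aside, the cloud access mentioned above is usually reached through IBM’s open-source Qiskit toolkit. The sketch below is illustrative only and assumes Qiskit is installed; it builds the simplest possible circuit, a single qubit put into superposition and then measured, without needing any cloud account:

```python
from qiskit import QuantumCircuit

# One qubit, one classical bit to hold the measurement outcome.
qc = QuantumCircuit(1, 1)
qc.h(0)            # Hadamard gate: puts the qubit into an equal superposition
qc.measure(0, 0)   # measuring collapses it to 0 or 1, each with probability 1/2

print(qc.draw())   # text diagram of the circuit
```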

Still, we should remember that the quantum computer is a matter of the future. In 2017 some media outlets announced that quantum supremacy had at last been achieved, but it is too early to shop for a personal quantum computer. In time, quantum computers will take their place in research centers and laboratories, data centers and factories, carrying out their specific missions (optimization, cryptography, the creation of new materials and molecules, and so on) faster than ordinary computers. And our ordinary “classic” computers will keep their place for the most routine tasks (servers, office and home applications, entertainment, multimedia) as the most efficient, compact, and inexpensive solution.

Author: Alisa Kandeeva



