My article in BrandsIT dated June 19, 2018.
Quantum computers have been known about since the 1980s, so why have they begun to stir such excitement only in the last few years? Experts say we are approaching a hard limit on the speed of computation. Modern processors pack billions of transistors, each only a few nanometers across, onto a very small surface. According to Moore’s law, the number of transistors in a microprocessor doubles roughly every two years. Unfortunately, the growth in processor performance is steadily slowing: we are approaching the technological limits of how many transistors can be “packed” onto such small surfaces. The limit that cannot physically be crossed is a transistor the size of a single atom, with a single electron used to switch its state between 0 and 1.
Where do the great advantages of a quantum computer come from? The easiest way to explain this is by comparison with a traditional computer. The device we know from everyday work performs all its operations using basic units of information: bits. A bit can in principle represent only two states, 0 and 1. A quantum computer goes beyond this scheme of two opposite values and uses intermediate states. A qubit (from “quantum bit”), as the basic unit of quantum devices is called, can take the values 0 and 1 simultaneously; more precisely, it can occupy any of an infinite number of states combining 0 and 1. This state is called a superposition. Only when the value of a qubit is checked does it collapse to one of the two basic states, 0 or 1.

This may seem a small difference, but a qubit in a superposition state can take part in many computations at the same time, thanks to the basic principles of quantum physics. Physically, a qubit can be realized by any quantum system with two distinct fundamental states, for example the spin of an electron or atom, two energy levels in an atom, or two polarizations of a photon, vertical and horizontal.

While n classical bits hold exactly one of their 2^n possible values at any moment, n qubits can be in a superposition of all 2^n values at once: two qubits hold not one but four values simultaneously, three qubits eight, and so on. The direct consequence is that a quantum computer can perform many operations in parallel that a traditional device cannot. To make this concrete: this quantum state allows the machine to process enormous data sets in an unimaginably short time. Imagine a collection so large that processing it with traditional computers would take millions of years. This seemingly abstract situation becomes reality once a quantum computer enters the picture.
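The superposition-and-measurement behavior described above can be sketched numerically. This is a minimal classical simulation, not real quantum hardware, and the amplitudes are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit in superposition: one amplitude for |0>, one for |1>.
# An equal superposition has alpha = beta = 1/sqrt(2).
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)

# The probability of each measurement outcome is the squared magnitude.
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
assert np.isclose(p0 + p1, 1.0)  # amplitudes must be normalized

# "Checking the value" of the qubit: measurement yields 0 or 1 at random,
# weighted by those probabilities; the superposition itself is never seen.
samples = rng.choice([0, 1], size=10_000, p=[p0, p1])
print(f"fraction of 1s over 10,000 measurements: {samples.mean():.3f}")
```

Repeated measurements of identically prepared qubits reveal the probabilities, which is why quantum algorithms are typically run many times and their outputs sampled.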
It can compute hundreds of thousands of times faster than devices built on advanced silicon components, and in principle millions of times faster! Ideal applications for such a machine include recognizing objects in a huge stock of photos, calculations on large numbers, and encrypting and breaking ciphers. On paper, the performance gap between a quantum computer and a traditional one can reach a factor of 18,000,000,000,000,000,000!
Unfortunately, qubits must be completely isolated from the environment, because they are very unstable and can be destroyed by, among other things, changes in ambient temperature, external radiation, light, or collisions with air molecules. That is why a vacuum, temperatures near absolute zero, and full environmental isolation are necessary. To show what an interesting and controversial phenomenon we are dealing with, let me give examples of extreme reactions from representatives of the IT industry. Not long ago, one of Google’s representatives said that the quantum D-Wave computer solved in one second a problem that a standard machine would probably need 10,000 years to finish! On the other hand, there are often opinions like that of the physicist Matthias Troyer. When it was announced three years ago that the D-Wave Two had, in a special test, solved its assigned task 3,600 times faster than a traditional computer, he questioned the results, arguing that it is hard to speak of such efficiency when no evidence confirms it. Perhaps the best summary of the confusion in this field comes from David Wineland of the US National Institute of Standards and Technology, who said: “I am optimistic that we will achieve success in the long run. However, this ‘longer time’ means that I do not know when it will happen.”
Link to the full article (in Polish)
– The invisible web that surrounds us, i.e. the internet of things
– According to our computers … You don’t exist
– Blockchain – the Holy Grail of the financial system?
– A hidden social network lurks within your company. Find it!
– Blockchain has a potential to upend the key pillars of our society
Quantum computers will be necessary to deal with the most complex AI applications, especially in Healthcare, Fintech, Security, etc. Very thorough article, Norbert!
I think it is very dangerous to package the gate model of quantum computing in with the things that quantum effects can actually accomplish, such as using statistical measurements of quantum particles to secure a communications channel, or generating random numbers. Both are extremely important for cryptography. Adiabatic optimization is also a promising field, but we still haven’t gotten to a point where we can prove that it works better than regular optimizers. What is scary is how the gate model, the most flexible quantum technology but the least likely to be feasibly implemented, is packaged in with all these other things. You’re packaging two proven successes with one likely success and one not-yet success. Basically, you’re doing with ideas what Wall Street did with mortgage-backed securities: stuff the thing full of badness and sprinkle in a little goodness.
That is incredible. Back in the day, this is how big some of the first computers were; now we carry them in our pockets. I wonder how long it will be before people see pictures like this and think it is old technology.
Several major applications have already been identified, including secure communication, clock synchronization, extending the baseline of telescopes, secure identification, achieving efficient agreement on distributed data, exponential savings in communication, quantum sensor networks, as well as secure access to remote quantum computers in the cloud.
Thank you! The article was very informative, but one thing seemed a bit unclear: how does the ‘quantum transmission’ take place? Does it mean that a sender ‘generates’ an entangled pair of particles (photons?), keeps one, and transmits the other entangled photon through a fiber cable to the receiver? Could the particle at the receiver’s end then be ‘tampered with’ from the sender’s end?
Impressive progress has been made recently in building quantum computers, and quantum machine learning techniques will become powerful tools for finding new patterns in big data.
But after I started reading the article my enthusiasm quickly died down, as I noticed they are NOT talking about “things impossible for classical ones”. This is not about greater-than-Turing completeness; it is just about good old quantum advantage. Don’t get me wrong, proving quantum advantage is cool, but it is not as earth-shattering as the greater-than-Turing completeness the title implied.
Quantum computers are good for modelling probabilistic processes because they are probabilistic systems.
Your qubits are ideally in a superposition with proportions mirroring the p and 1 - p of your problem. “Heads” means a successful outcome, “tails” means an unsuccessful outcome, but you need a large enough sample of qubits to be representative of the system you are trying to model.
Right now we can only model simple systems, like chemistry at a very small scale (a few atoms, a few molecules), because we have relatively few qubits, and so consensus is weak, and so we need to measure systems with both a limited number of potential outcomes and a limited sampling space (a small “Omega” – sorry, I have a laptop without a numpad and don’t know how to do unicode any other way).
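The sampling point above can be illustrated entirely classically: estimating the underlying probability p from few samples gives a noisy answer, just as consensus is weak when few qubits are available. A small sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.3  # true "heads" (success) probability of the process being modelled

# More samples -> the estimate of p tightens around the true value,
# analogous to needing enough qubits for a representative sample.
for n in (10, 100, 10_000):
    flips = rng.random(n) < p  # n Bernoulli(p) trials
    print(f"n = {n:>6}: estimated p = {flips.mean():.3f}")
```

With n = 10 the estimate can land far from 0.3; by n = 10,000 the standard error has shrunk to well under a percentage point.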
Quantum computing is expected to solve computational questions that cannot be addressed by existing classical computing methods. It is now accepted that the very first discipline that will be greatly advanced by quantum computers is quantum chemistry.
A quantum algorithm exists for doing it, and no one has yet produced a classical equivalent. It’s not proven that no classical algorithm exists, but I didn’t claim that either. The point is that, as far as we know, a quantum computer can do it and a classical one cannot.
It’s an important point because “formal proof of a long held conjecture” is different from “proof of something absolutely new.”
Non-quantum algorithms that are used to analyze the data can predict the state of the grid, but as more and more phasor measurement units are deployed in the electrical network, we need faster algorithms. Quantum algorithms for data analysis have the potential to speed up the computations substantially in a theoretical sense, but great challenges remain in achieving quantum computers that can process such large amounts of data.
There seems to be a theoretical limit to any Moore’s Law-type trend that will apply with quantum circuitry. And Moore’s Law is already exhausted for classical circuits without even needing a theoretical reason to be exhausted. There’s no reason to think spatial density of switches is a chief concern for the efficiency of quantum computers. Unless you’re inferring some analogous exponential scaling trend for the size of dilution refrigerators, which would make more sense.
We are now at an exciting moment in time, akin to the eve of the classical internet. Recent technological progress now suggests that we may see the first small-scale implementations of quantum networks within the next five years.
Nice. It’s not brute forcing it; it’s using different algorithms that take advantage of the fact that with n qubits you have all 2^n possible combinations in superposition with one another. So in effect you have an exponentially large state space that can be used to explore ‘correct’ answers. Something that is exponentially hard (i.e. add a bit and the required computation doubles each time) can in theory now be run in polynomial time (i.e. add a bit and the computation time grows by roughly the same amount per bit). So with the right-sized quantum computer you could in theory crack elliptic-curve crypto.
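The exponential state space mentioned above is easy to make tangible: a classical simulation of n qubits in uniform superposition must store 2^n amplitudes, which is precisely what blows up. A minimal sketch (the function name is illustrative, not any library’s API):

```python
import numpy as np

def uniform_superposition(n):
    """State vector of n qubits after a Hadamard gate on each:
    all 2**n basis states with equal amplitude 1/sqrt(2**n)."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim))

for n in (1, 2, 10, 20):
    state = uniform_superposition(n)
    print(f"{n:>2} qubits -> {state.size:>9} amplitudes to track classically")

# The classical memory cost doubles with every added qubit; the quantum
# device carries that state space natively, which is where the advantage
# of algorithms like Shor's comes from.
```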
The kicker is that the qubits everybody refers to when talking about this are perfect, error-free qubits. Such a logical qubit will likely take ~10,000 physical qubits to create. And you’ll need ~100 of them, so a system of about a million physical qubits. And we have 50. So there’s that.
Ugh, let’s just hope there are techniques for designing quantum computers that are sped up by quantum algorithms.
Sounds like profitable nuclear fusion: ready in 20 years
Will be fascinating to see if the scaling problems can be overcome – MIT described it as potentially an inverse Moore’s law – every doubling of qubits will take twice as long and cost twice as much.
A quantum computer exploits quantum physics to rapidly uncover the right answer to a problem by sifting through and adjusting probabilities, while a classical computer will be burning up memory and time looking at each potential answer in turn.
I think this counter-intuitive “truth” is one of the things most people pass over. While for many of us the development of the technology is a mystery, the fact that it appears so simply and regularly in our lives has made it easy to accept, especially with technologies as advanced as quantum computing. Show me somebody who understands quantum physics 🙂
Apples and oranges, my friend: transistors do not need to be near absolute zero to operate, nor are they subject to time-based decoherence (among MANY other differences). On a similar theme, Moore’s law effectively ended for silicon when Apple reached the 10 nm manufacturing node, meaning that trace widths are roughly tens of copper atoms wide.
What could we possibly need home quantum computers for? Everything is moving to cloud computation. Your phone will be the receiver and it will link seamlessly to a keyboard and mouse and monitor through Thunderbolt X.1.
You won’t have a need for a computer tower in the future unless you’re one of those “physical hardware hackers”. It’s like you don’t even enjoy the fingerchips that allow typing on any surface at any angle.
Keep up with the times
And the final stage will come when these quantum computers finally surpass their conventional cousins, making it possible to create distributed networks of computers capable of carrying out calculations that were previously impossible, and to share the results instantly and securely around the world.
Nice beginning. Unfortunately the rest in a language I do not understand