My article in FORBES on March 13, 2018.
An undisputed pioneer of the technology behind quantum computers is the Canadian company D-Wave Systems. According to its website, its clients include government agencies such as the CIA and NSA, research centers such as NASA, and commercial entities such as Google and Lockheed Martin. The European Union also intends to spend a billion euros on similar research. Technology companies are developing their own solutions, seeing wide applications for the enormous power dormant in computations based on quanta, the basic units from which our world is built.
The electrons are too crowded. Moore’s Law
Why is this technology such a breakthrough? Modern processors consist of billions of transistors, each just a few nanometers across, packed onto a very small surface. According to Moore's law, the number of transistors in a microprocessor doubles roughly every two years. Unfortunately, the growth of processors' computing power is steadily slowing down. We are reaching the technological limits of packing ever more transistors onto such small surfaces. The limit, which cannot physically be crossed, is a transistor the size of a single atom, with a single electron used to switch its state from 0 to 1.
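The doubling described by Moore's law is easy to see in numbers. A minimal sketch (the starting count of one billion transistors and the two-year cadence are illustrative assumptions, not figures from any particular chip):

```python
# Illustrative sketch of Moore's-law growth: the transistor count
# doubles once every `doubling_period` years.
def transistors(start_count, years, doubling_period=2):
    """Project a transistor count after `years` of doubling."""
    return start_count * 2 ** (years // doubling_period)

# Starting from a hypothetical 1 billion transistors:
for y in (0, 2, 10, 20):
    print(y, "years:", transistors(1_000_000_000, y))
# After 20 years the count has doubled 10 times, i.e. grown about 1000-fold.
```

That thousand-fold growth per two decades is exactly what is now running into the single-atom limit described above.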
Technology from a cold vacuum
The advantages of a quantum computer are easiest to show by comparison with a traditional one. The device we know from everyday work performs all of its operations using a basic unit of information, the bit, which in principle can represent only two states: 0 and 1.
A quantum computer, by contrast, makes use of intermediate states, going beyond the scheme of two opposite values. The qubit (from "quantum bit"), as the basic unit of quantum devices is called, can take the values 0 and 1 simultaneously; more precisely, it can occupy any of infinitely many combinations of 0 and 1. This state is called a superposition. Only when the qubit's value is measured does it take one of the two basic states, 0 or 1.
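The superposition-then-measurement behaviour can be sketched in a few lines. This is a toy classical simulation, not real quantum hardware: a qubit is modelled as two amplitudes alpha and beta, and measurement collapses it to 0 or 1 with the corresponding probabilities.

```python
import random
from math import sqrt

# Toy model of one qubit: amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measure(alpha, beta):
    p0 = abs(alpha) ** 2          # probability of reading 0
    return 0 if random.random() < p0 else 1

# Equal superposition: (|0> + |1>) / sqrt(2)
alpha = beta = 1 / sqrt(2)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # roughly 0.5: each outcome ~50% likely
```

Each individual measurement gives a definite 0 or 1; the superposition shows up only in the statistics over many runs.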
Link to the full article (in Polish)
Related articles on my blog:
– Only God can count that fast – the world of quantum computing
– Machine Learning. Computers coming of age
– According to our computers … You don’t exist
– What a machine will think when it looks us in the eye?
– Fall of the hierarchy. Who really rules in your company?
– Blockchain has a potential to upend the key pillars of our society
I think it is very dangerous to package the gate model of quantum computing together with the things that quantum effects can actually accomplish, such as using statistical measurements of quantum particles to secure a communications channel, or generating random numbers. Both are extremely important for cryptography. Adiabatic optimization is also a promising field, but we still haven't reached the point where we can prove that it works better than regular optimizers. What is scary is how the gate model, the most flexible quantum technology but the least likely to be feasibly implemented, is packaged in with all these other things. You're packaging two proven successes with one likely success and one not-yet success. Basically, you're doing with ideas what Wall Street did with mortgage-backed securities: stuff the thing full of badness, and sprinkle in a little goodness.
Whereas classical bits can take only two values, 0 or 1, qubits can be in a superposition of being 0 and 1 at the same time. Moreover, qubits can be entangled with each other, leading to correlations over large distances that are much stronger than is possible with classical information. Qubits also cannot be copied, and any attempt to do so can be detected. This feature makes qubits well suited for security applications but at the same time makes the transmission of qubits require radically new concepts and technology. Rapid experimental progress in recent years has brought first rudimentary quantum networks within reach, highlighting the timeliness and need for a unified framework for quantum internet researchers.
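The "stronger than classical" correlations mentioned above are easiest to see in the simplest case: measuring both halves of a Bell pair in the same basis. A toy classical simulation of just that one scenario (real entanglement requires quantum hardware and shows up in statistics no local model can reproduce):

```python
import random

# Toy sketch of measuring both qubits of the Bell state
# (|00> + |11>)/sqrt(2) in the computational basis: each outcome is
# individually random, but the two results are always identical.
def measure_bell_pair():
    outcome = 0 if random.random() < 0.5 else 1
    return outcome, outcome      # perfectly correlated results

pairs = [measure_bell_pair() for _ in range(1_000)]
print(all(a == b for a, b in pairs))  # True: the correlation never breaks
```

No matter how many pairs are measured, the two readouts always agree, even though neither side can predict its own result in advance.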
Non-quantum algorithms that are used to analyze the data can predict the state of the grid, but as more and more phasor measurement units are deployed in the electrical network, we need faster algorithms. Quantum algorithms for data analysis have the potential to speed up the computations substantially in a theoretical sense, but great challenges remain in achieving quantum computers that can process such large amounts of data.
Chemistry is one of the first commercially lucrative applications for a variety of reasons. Researchers hope to discover more energy-efficient materials to be used in batteries or solar panels. There are also environmental benefits: about two percent of the world’s energy supply goes toward fertilizer production, which is known to be grossly inefficient and could be improved by sophisticated chemical analysis.
No, exactly the opposite. Quantum computers can solve problems that classical computers can only solve in very restricted cases (small data sets). As the data set grows, the problem quickly becomes impossible even for the largest and fastest computers on Earth, or indeed for the largest computers it is possible to build in practice.
These problems are possible to solve for classical computers, but only in a restricted sense.
The main advantages of a quantum communication network over a conventional one are speed and security. Entanglement correlates qubits across arbitrarily large distances: no matter how far apart you put two entangled qubits, measuring one has an instant and measurable impact on the state of the other. These correlations cannot by themselves carry a message faster than light, but they do make any eavesdropping detectable.
IBM Just Broke The Record of Simulating Chemistry With a Quantum Computer
Last year, Google engineers simulated the bonding of a pair of hydrogen atoms on its own quantum computer, demonstrating a proof of principle in the complex modelling of the simplest arrangement of energies in molecules.
Molecular simulations aren’t revolutionary on their own – classical computers are capable of some pretty detailed models that can involve far more than three atoms.
But even our biggest supercomputers can quickly struggle with the exponential nature of keeping track of quantum interactions of each new electron involved in a molecule’s bonds, something which is a walk in the park for a quantum computer.
So some problems, even though they may seem very simple, are inherently exponential. Whenever we add a little bit to the problem size, the time it takes multiplies. For most of these kinds of problems, that means we quickly reach a point where it’s physically impossible for us to compute an answer.
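The exponential blow-up is concrete: describing n qubits classically takes 2^n complex amplitudes. A minimal sketch of the memory this implies (16 bytes per complex amplitude is an assumption, matching double-precision complex numbers):

```python
# A system of n qubits needs 2**n complex amplitudes to describe
# classically. At 16 bytes per amplitude, memory grows exponentially.
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(n, "qubits:", statevector_bytes(n), "bytes")
# 30 qubits already need 16 GiB; 50 qubits would need 16 PiB,
# far beyond any single machine.
```

Adding one qubit doubles the requirement, which is exactly the "add a little, time multiplies" behaviour described above.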
Quantum computers are designed specifically for modelling probabilistic processes, which is how much of the real world functions: water doesn't ripple only because of the wind (which is itself probabilistic) but also because of gravitational perturbations caused by magma and the tectonic meanderings of the earth below.
With a traditional computer you have to know all of the inputs to model the math for each step in the process. With probabilistic computing you can train a model on a known sample space and get a pretty decent reflection of reality, because you are using natural probabilistic processes to calculate natural probabilistic processes.
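The sampling idea sketched above has a familiar classical cousin: Monte Carlo estimation, where random draws from a sample space stand in for an exact calculation. A minimal illustration (estimating pi, chosen purely as a well-known example):

```python
import random

# Monte Carlo sketch: estimate pi by sampling random points in the
# unit square and counting how many land inside the quarter circle.
# The randomness itself does the work, no closed-form math needed.
def estimate_pi(samples=100_000):
    inside = sum(
        1 for _ in range(samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4 * inside / samples

print(estimate_pi())  # close to 3.14159
```

A quantum computer takes this idea further by sampling from distributions that no efficient classical process can produce.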
Current cryptography depends on math problems that would take so long to solve that no one bothers wasting computer resources on them, but they are solvable in principle. And the bad news is that quantum computers with 1,000-qubit power could solve them in moments, not ages. Today's public- and private-key encryption and digital signatures rest on exactly such hard problems. (Search for something called Shor's algorithm.)
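The hard step that Shor's algorithm attacks is period finding: given a and N, find the smallest r with a^r = 1 (mod N), from which the factors of N follow. Classically this search scales badly with the size of N; a quantum computer finds r in polynomial time. A toy classical sketch of the search (the numbers are deliberately tiny, since real RSA moduli have hundreds of digits):

```python
# Brute-force period finding: the smallest r with a**r % N == 1.
# This linear search is exactly what becomes infeasible for
# cryptographic key sizes, and what Shor's algorithm speeds up.
def find_period(a, N):
    r, value = 1, a % N
    while value != 1:
        r += 1
        value = (value * a) % N
    return r

r = find_period(2, 15)   # 2**4 = 16 = 1 (mod 15), so r = 4
print(r)
```

With the period in hand, gcd(a^(r/2) - 1, N) and gcd(a^(r/2) + 1, N) reveal the factors of N, which is why fast period finding breaks factoring-based encryption.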
IBM sees quantum computing going mainstream within five years
IBM predicts five technologies that will change the world in the next five years.
From quantum computing to “unbiased” artificial intelligence, CNBC breaks down what the predictions will mean.
Development of quantum memory will allow much more complicated communication protocols that require quantum information to be stored while further communication goes on. This is a major challenge, though, because quantum states rapidly degrade through a process called decoherence. Most technology proposals only hold their states for seconds or fractions of a second, which poses problems for a network whose communication times are longer than that.