FORBES: Will the quantum computer become the curse of cryptocurrencies?

The theoretical foundations of quantum computers were described back in the 1980s. Only recently has technology caught up with theory, allowing the first quantum computers to be built. Will their arrival render the blockchain obsolete?


My article in FORBES on March 13, 2018.


The undisputed pioneer of the technology behind quantum computers is the Canadian company D-Wave Systems. According to its website, its clients include government agencies such as the CIA and NSA, research centers such as NASA, and commercial entities such as Google and Lockheed Martin. The European Union also intends to spend a billion euros on similar research. Technology companies are developing their own solutions, seeing broad applications for the enormous computing power dormant in calculations based on quanta, the elementary particles from which our world is built.


Electrons are too crowded. Moore’s law

Why is this technology such a breakthrough? Modern processors consist of billions of transistors a few nanometers in size, packed onto a very small surface. According to Moore’s law, the number of transistors in a microprocessor doubles roughly every two years. Unfortunately, the growth of processor computing power keeps slowing down, because we are reaching the technological limits of packing ever more transistors onto such small surfaces. The limit that cannot be physically crossed is a transistor the size of a single atom, with a single electron used to switch its state from 0 to 1.
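To make the doubling rule concrete, here is a back-of-the-envelope sketch in Python. The 2010 baseline of 2.3 billion transistors and the two-year doubling period are illustrative assumptions, not figures from the article.

```python
# Rough projection of Moore's law: transistor counts doubling about every two years.
# Baseline year and count below are illustrative assumptions only.
def transistors(year, base_year=2010, base_count=2.3e9, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (2010, 2014, 2018, 2022):
    print(year, f"{transistors(year):.2e}")
```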


Technology from a cold vacuum

The advantages of a quantum computer are easiest to show by comparing it with a traditional computer. The device we know from everyday work performs all of its operations using the basic unit of information, the bit, which in principle can represent only two states: 0 and 1.

A quantum computer, by contrast, makes use of intermediate states, going beyond the scheme of two opposite values. The qubit (from “quantum bit”), as the basic unit of quantum devices is called, can take the values 0 and 1 simultaneously; more precisely, it can occupy any of infinitely many intermediate states that combine 0 and 1. Such a state is called a superposition. Only when the value of the qubit is checked (measured) does it collapse to one of the two basic states, 0 or 1.
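To make the idea of superposition concrete, here is a minimal Python sketch that represents a qubit as a pair of complex amplitudes and simulates a measurement. The equal-superposition state chosen here is just an illustrative example.

```python
# A minimal model of a qubit: two complex amplitudes, one for |0> and one for |1>.
# The squared magnitudes give the probabilities of each measurement outcome
# (the Born rule). The equal superposition below is only an example state.
import numpy as np

state = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |0> and |1> in superposition

probabilities = np.abs(state) ** 2                      # -> [0.5, 0.5]

# "Checking the value" of the qubit collapses it to a single classical bit.
outcome = np.random.choice([0, 1], p=probabilities)
print(probabilities, outcome)
```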


Link to the full article (in Polish)


Related articles on my blog:

Only God can count that fast – the world of quantum computing

Machine Learning. Computers coming of age

According to our computers … You don’t exist

What a machine will think when it looks us in the eye?

Fall of the hierarchy. Who really rules in your company?

Blockchain has a potential to upend the key pillars of our society



3 comments

  1. TomK

    IBM Just Broke The Record of Simulating Chemistry With a Quantum Computer

    Last year, Google engineers simulated the bonding of a pair of hydrogen atoms on its own quantum computer, demonstrating a proof of principle in the complex modelling of the simplest arrangement of energies in molecules.

    Molecular simulations aren’t revolutionary on their own – classical computers are capable of some pretty detailed models that can involve far more than three atoms.

    But even our biggest supercomputers can quickly struggle with the exponential nature of keeping track of quantum interactions of each new electron involved in a molecule’s bonds, something which is a walk in the park for a quantum computer.

    https://futurism.com/ibm-just-broke-the-record-of-simulating-chemistry-with-a-quantum-computer/
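As a rough illustration of the exponential scaling the comment above describes, the short Python sketch below estimates the classical storage needed to track n fully interacting qubits; the figure of 16 bytes per complex amplitude is an assumption for illustration.

```python
# Why classical machines struggle: describing n interacting qubits (or the
# quantum state of a molecule's electrons) takes 2**n complex amplitudes.
# The 16 bytes per amplitude is an illustrative assumption.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits -> 2**{n} = {amplitudes:,} amplitudes, ~{amplitudes * 16:,} bytes")
```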

  2. Tom Jonezz

    Current cryptography depends on math problems that would take so long to crack that nobody bothers wasting computer resources on them, but they are solvable in principle. The bad news is that quantum computers with 1,000-qubit power could solve them in moments, not ages, and today’s public- and private-key encryption and digital signatures rest on exactly such problems. (Search for something called Shor’s algorithm.)
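For readers curious about the Shor’s algorithm mentioned above, the sketch below shows only the classical number theory it relies on, with the period found by brute force; a real quantum computer would perform that period-finding step exponentially faster. The values N = 15 and a = 7 are toy examples, not anything a quantum machine has been reported to factor here.

```python
# Toy, classical illustration of the number theory behind Shor's algorithm.
# A quantum computer finds the period r exponentially faster; here we
# brute-force it for a tiny modulus. Values are illustrative only.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a**r == 1 (mod N), found by brute force."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # lucky guess already shares a factor
    r = find_period(a, N)                  # the step Shor's algorithm speeds up
    if r % 2 == 1:
        return None                        # odd period: try another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                        # trivial root: try another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))   # -> (3, 5), the factors of 15
```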