Last year saw a breakthrough in the spread of the notion of Artificial Intelligence. The term was frequently mentioned in the media, attracting growing attention, and not without reason. Much of the hype stemmed from the activities of business figures like Elon Musk and Mark Zuckerberg, as well as from research conducted by Google and new applications of virtual and augmented reality. As a consequence, this once somewhat abstract concept has been transformed into tangible solutions: new devices and services that change people’s lives.
What we saw in 2016 is only the start of a series of events that are bound to unfold in years to come. We will inevitably witness artificial intelligence spreading into all areas of life. We are in for a number of fascinating developments associated with the rise of autonomous transportation, quantum computers, supercomputers and innovative applications. The technological revolution will continue to change the ground rules of business.
Outlined below are some of the key technological trends that may dominate digital technology in 2017. I have divided them into four main segments:
Artificial Intelligence as a foundation for key technologies
The term ‘Artificial Intelligence’ has enjoyed an incredible media career. No wonder, considering how much it gives us to think and talk about. Prepare to see it gain even more publicity this year.
Artificial intelligence rests on algorithms that enable computers to learn on their own and make decisions, and on programs capable of replicating selected mental functions, reading and interpreting texts, and even understanding natural language. I think we should take note of three developments that may soon propel technology to an enormous leap forward on the back of AI.
Endowed with ever-increasing computational power, computers are beginning to embrace processes that mimic human cognition. Work is under way to develop quantum computers whose data processing capacities will be many times those of today’s binary machines. Intel Labs, IBM and Google Quantum A.I. Lab are all seeking to emulate human brain functions. The key presumption is that the simulated systems will operate faster than the brain itself. While neurons transmit information at approximately 150 meters per second, optical fiber can do it nearly 2 million times faster. The pivotal observation is that by analyzing large data sets, today’s machines will be able to learn and self-improve. The best-known example is IBM Watson, a computer capable of answering medical and economic questions with increasing accuracy.
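The speed comparison above is easy to verify with back-of-the-envelope arithmetic. Note the figure in the text corresponds to the speed of light in a vacuum; signals in real optical fiber propagate at roughly two-thirds of that, which still leaves a gap of over a million times:

```python
# Sanity check on the neuron-vs-fiber comparison. Values are rough,
# commonly cited approximations, not measurements.
NEURON_SPEED = 150.0   # m/s, approximate peak for fast myelinated axons
LIGHT_VACUUM = 3.0e8   # m/s, speed of light in a vacuum
LIGHT_FIBER = 2.0e8    # m/s, roughly 0.67 c in silica fiber

print(f"vs vacuum light: {LIGHT_VACUUM / NEURON_SPEED:,.0f}x")
print(f"vs fiber signal: {LIGHT_FIBER / NEURON_SPEED:,.0f}x")
```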
Path to Cognitive Computing. Source: IBM
Linked inextricably to “thinking” computers is machines’ ability to learn. Modeled on neural networks, this ability enables computers to predict trends. What this means for the financial world is, among other things, a chance to base future transaction models on real-time analyses. According to a report by McKinsey & Company, more than a dozen European banks resolved in 2016 to replace traditional statistical models with self-learning machines. Marketing is another area poised to benefit from machine learning in designing promotions and selecting sales channels. Even today, self-learning machines make recommendations to online shoppers, allowing websites to respond to customer behaviors in real time, and ultimately, maximize sales. The implications of advances in this field are going to become increasingly evident.
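To make the idea of “learning from customer behavior” concrete, here is a minimal sketch: a single artificial neuron (logistic regression) that teaches itself to predict whether a shopper will click a recommendation. The data, feature names and threshold are entirely invented for illustration; production systems use far richer models and real behavioral data:

```python
import math
import random

random.seed(0)

def make_data(n=200):
    """Synthetic shoppers: two made-up behavioral features and a label."""
    data = []
    for _ in range(n):
        recency = random.random()    # how recently the user browsed (0..1)
        affinity = random.random()   # similarity to past purchases (0..1)
        label = 1 if recency + affinity > 1.0 else 0  # invented rule
        data.append(((recency, affinity), label))
    return data

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=500, lr=0.5):
    """Stochastic gradient descent on log-loss for one neuron."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y              # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def accuracy(data, w, b):
    correct = sum(
        1 for (x1, x2), y in data
        if (sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1)
    )
    return correct / len(data)

data = make_data()
w, b = train(data)
print(f"training accuracy: {accuracy(data, w, b):.2f}")
```

The point of the sketch is the feedback loop: the model adjusts its weights after every observed outcome, which is exactly what lets a website respond to customer behavior without anyone reprogramming it.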
Machine Learning – process description based on neural networks.
Natural Language Understanding
This trend may unfold in a number of ways. Natural language understanding comes into play when, for instance, a robot receives and responds to a voice instruction. The term encompasses all issues associated with having machines read and comprehend complex texts, such as press articles and reports. For corporate workers, having computers read e-mails and arrange them in a prescribed order offers considerable help in their daily work. One can expect a slew of new companies investing heavily in services that rely on natural language understanding technology. Even today, machines can be instructed to produce reports on any topic (as is done by, for instance, the Associated Press) or prepare a summary based on data supplied in a variety of formats (charts, infographics, numbers).
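The e-mail-sorting workflow described above can be sketched in a few lines. Real natural language understanding relies on statistical or neural models, so the keyword scorer below is only a toy stand-in; the categories and keywords are invented for the example, but the routing logic (read each message, assign it to the best-matching folder) is the same:

```python
import string

# Hypothetical folders and trigger words, invented for illustration.
CATEGORIES = {
    "urgent":  {"asap", "urgent", "immediately", "deadline"},
    "finance": {"invoice", "payment", "budget", "quarterly"},
    "general": set(),  # fallback when nothing matches
}

def classify(text: str) -> str:
    """Route a message to the category whose keywords it hits most."""
    words = {w.strip(string.punctuation) for w in text.lower().split()}
    best, best_hits = "general", 0
    for category, keywords in CATEGORIES.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = category, hits
    return best

inbox = [
    "Please send the invoice and payment details",
    "Need the report ASAP, deadline is tomorrow",
    "Lunch on Friday?",
]
for mail in inbox:
    print(f"{classify(mail):8s} | {mail}")
```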
Time for independent machines
It is fascinating to see computers thinking ever more efficiently, interpreting gigantic data sets, drawing conclusions and offering independent interpretations. As reluctant as I am to join the doomsayers who claim that humanity is going to perish under machine domination, I admit that game-changing technology is looming ahead, and the way it threatens to make all things different makes me a bit uneasy. For the time being, I am confident this year will bring a number of interesting developments proving that computers have made a great deal of headway towards becoming “mentally” independent.