Technology 2020. Algorithms in the cloud, food from printers and microscopes in our bodies


Technology trends 2020 Norbert Biedrzycki blog

I have a growing sense – which I think I share with many readers – that people are getting used to hearing about artificial intelligence. Neural networks are making themselves at home in our businesses and households, either creeping in discreetly or splashing in spectacularly. We no longer speculate about what the advent of AI may mean; instead, we focus on specific applications and benefits. The changes seen over the past few years are extraordinary, not only in the technology that surrounds us but also in our economies and ways of life. For instance, when I started writing this blog in 2016, the sharing economy was an idea that appealed to many but remained highly hypothetical. Today, as I go out, it takes me seconds to use an application to locate a shared car sitting around the corner, pay for it by the hour, and start driving. This is just one example from daily life; I could name many more from the world of business. What will 2020 bring?

What follows is a list of subjectively selected trends that I think will take off in 2020. It is difficult to say which of them will be the most successful and what that success might be. The trends I am describing are not isolated, meaning that growth in any given area, such as 5G technology, will have a spill-over effect and feed into other trends, including those left out from this year’s selection, such as the Internet of Things, autonomous vehicles and blockchain. The reason they were omitted is not because these technologies are any less significant or stagnant but rather because I have discussed them many times before.

Artificial intelligence taking to the cloud

Machine learning, natural language processing, and image recognition are becoming increasingly familiar to sales, marketing and IT departments. Debates on whether to take advantage of artificial intelligence have grown more substantive, which naturally raises questions about money. Only the largest players can afford to develop their own systems. Hence, a niche has emerged for machine learning platforms made available at low cost. Virtual tools allow companies to choose what data and resources they want to process with algorithms, and when. Graphical user interfaces give access to many of these platforms’ functions, so even workers without advanced technical skills can test AI performance in their fields. The cloud AI segment is growing by leaps and bounds, and the heyday of this service is still ahead: less affluent businesses can now test AI in relative comfort. In 2020 the cloud is once again changing the face of the market.
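
To make this concrete, here is a minimal sketch of what calling such a hosted model might look like from the client side. The endpoint URL and payload shape are hypothetical, invented for illustration; every provider defines its own:

```python
import base64
import json

# Hypothetical hosted-ML endpoint: the URL, path and payload shape are
# invented for illustration; every cloud provider defines its own.
ENDPOINT = "https://ml.example.com/v1/models/image-classifier:predict"

def build_predict_request(image_bytes, top_k=3):
    """Package raw image bytes into the kind of JSON body a cloud model
    server typically accepts: a base64 payload plus a few options."""
    payload = {
        "instances": [{"image_b64": base64.b64encode(image_bytes).decode("ascii")}],
        "parameters": {"top_k": top_k},
    }
    return json.dumps(payload)

# The client only prepares data and reads back ranked labels; training
# and serving stay entirely on the provider's side.
body = build_predict_request(b"\x89PNG...", top_k=5)
print(json.loads(body)["parameters"]["top_k"])  # 5
```

The point of the sketch: the hard part (training, serving, scaling) lives behind the endpoint, which is exactly why smaller businesses can now afford to experiment.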


In 2020 the world belongs to Big Brother

Computer image analysis and facial recognition have become all the rage. Every day, machines learn to see things better, which we notice when, for instance, we log into a smartphone that runs a facial scan. By all indications, such scanning will soon be the norm for shoppers, rendering card readers obsolete; in China, it is already commonplace. As machines improve their vision, robotics is inching closer to a major breakthrough. Robots capable of making sense of their immediate surroundings will integrate better with human environments. Better image analysis can also accelerate advances in autonomous vehicles and streamline the detection of defects in devices produced on assembly lines.
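
A toy illustration of the underlying idea: a classifier that has “seen” labeled example images can assign a label to a new one by comparing pixels. The nearest-centroid rule below stands in for a trained neural network, and the 3x3 “images” are made up:

```python
# Toy sketch of image classification: 3x3 "images" as flat pixel lists,
# with a nearest-centroid rule standing in for a trained neural network.
def centroid(images):
    n = len(images)
    return [sum(px) / n for px in zip(*images)]

def classify(image, centroids):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(image, centroids[label]))

# Two made-up classes: a bright vertical bar vs. a bright horizontal bar.
vertical = [[0, 1, 0, 0, 1, 0, 0, 1, 0], [0, .9, 0, 0, 1, 0, 0, .8, 0]]
horizontal = [[0, 0, 0, 1, 1, 1, 0, 0, 0], [0, 0, 0, .9, 1, .8, 0, 0, 0]]
centroids = {"vertical": centroid(vertical), "horizontal": centroid(horizontal)}

print(classify([0, .7, 0, 0, .9, 0, 0, 1, 0], centroids))  # vertical
```

Real systems learn far richer features than raw pixels, but the principle – compare a new image against what the model learned from examples – is the same.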

Needless to say, we are aware of the risks involved in such development. Image-recognition algorithms are imperfect; the presence of cameras on city streets and in schools and office buildings is sparking protests. Some US and French cities have already begun taking them down. The recognition of people’s images in social media is also an issue. Hence, proper regulation is necessary. Any technology that is either misused or used recklessly can generate social tensions and lead to abuse. This said, I firmly believe that machines that can see better are the future of our civilization.

Awaiting the arrival of a great network

5G technology can alter communication standards and create a new quality in data transfers. The consequences for industry, research and regular consumers of electronic content are profound. Equipped with ultra-fast transmission networks, consumers will be free to stream top-quality movies and music. If everything goes to plan, for the first time in the history of mobile networks, location will no longer constrain people. In practical terms, this may mean the rise of an unlimited global transmission network. The consequences for business are enormous. Such robust data transfer technology may well fuel the growth of many other technologies and projects that now remain dormant, waiting for transmission channels to be unlocked. One example is the Internet of Things, which hooks up a wide range of digital devices to a global network: phones, TVs, cars, household appliances, cameras, robots and all kinds of electronic industrial equipment. If 5G develops as predicted in 2020, we will enter a new technological age characterized by unconstrained communication among devices.
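
A back-of-envelope calculation shows why the jump matters for streaming and downloads. The throughput figures below are illustrative assumptions, not guaranteed speeds for any network:

```python
# Back-of-envelope download times, assuming illustrative average
# throughputs: ~50 Mbit/s for a 4G link vs ~1 Gbit/s for an early 5G one.
def download_seconds(size_gb, mbit_per_s):
    bits = size_gb * 8 * 1000**3        # decimal gigabytes to bits
    return bits / (mbit_per_s * 10**6)  # bits over bits-per-second

movie_gb = 15  # rough size of a high-quality 4K feature film
print(round(download_seconds(movie_gb, 50)))    # 2400 s (40 min) on 4G
print(round(download_seconds(movie_gb, 1000)))  # 120 s (2 min) on 5G
```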


Seduced by machine eloquence

I think the next few years may see a breakthrough in devices’ ability to understand the human voice. Our interactions with bots are poised to change, as simple commands and questions are gradually replaced with more natural conversation. For this to happen, a voice-operated device must be more than a mere programmed machine that performs a few isolated tasks. Semantic Machines, a Microsoft-owned company, is doing research on conversational intelligence. The intention is for devices that respond to the human voice to comprehend complex contexts, i.e. to process information streams from multiple sources. The research gives hope of producing voice assistants that link situational contexts and meanings with emotions and anticipate an interlocutor’s intentions. If you tell your assistant to book a theater ticket, it should respond by asking what time would suit you, whether you want to get there by metro, and whether you are planning to eat out after the performance. The assistant becomes proactive and capable of expressing complex meanings. Reports on the “Alexization” of our lives, named after the popular voice assistant Alexa, may not be greatly exaggerated. Today, Alexa already connects to over 80,000 digital devices of various types, ranging from home appliances to all kinds of consumer electronics. Other assistants include Google Home and Apple HomePod. If these devices master the art of conversation, their presence in our lives will become as significant as Google searches are today.
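
A minimal sketch of what “proactive” means in practice: instead of one canned reply, the assistant works through unfilled context slots with follow-up questions. The slot names and wording are invented for illustration:

```python
# Minimal sketch of a "proactive" assistant: a booking intent triggers
# follow-up questions instead of one canned reply. The slot names and
# question wording are invented for illustration.
FOLLOW_UPS = {
    "time": "What time would suit you?",
    "transport": "Would you like to get there by metro?",
    "dinner": "Are you planning to eat out after the performance?",
}

def next_question(context):
    """Return the first follow-up whose slot the user has not filled yet."""
    for slot, question in FOLLOW_UPS.items():
        if slot not in context:
            return question
    return None  # all slots filled: the assistant can book the ticket

context = {}
print(next_question(context))  # "What time would suit you?"
context["time"] = "7 pm"
print(next_question(context))  # "Would you like to get there by metro?"
```

Real conversational systems infer slots from free-form speech rather than reading a dictionary, but the design idea – track context, then ask for what is missing – is the same.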

2020. Robots learning to be versatile

A machine that could move objects over distances would significantly reduce delivery times. But to be accepted in a business environment, it will have to woo people with additional skills. And it soon will. New-generation robots carry technologies for processing visual data and understanding what happens around them. The effect can be impressive: a robot capable of navigating neatly around employees, avoiding obstacles, recognizing floor-surface anomalies and, importantly, taking in a growing volume of data to bolster its “brain”. This portends a breakthrough in robotization, promising to overcome two barriers that have significantly slowed robots’ conquest of the business world. First, it will no longer be necessary to keep people and machines in strictly segregated zones, because robots will be able to yield to humans when needed. Second, the skill most in demand is for robots to learn to perform many tasks simultaneously; while today’s machines are greatly limited in their ability to multitask, counterparts that excel at it are looming. As robots continue to benefit from advances in voice processing, their communication skills will grow rapidly, allowing them to blend seamlessly into the human environment. And one more thing: agility. The experimental robots manufactured by Boston Dynamics are outstandingly good gymnasts, and yet certain human activities remain beyond their grasp. Even here, however, there has been progress: robots are currently in intensive training to learn to accurately recognize the shape of the objects they hold.
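
The navigation skill described above can be sketched with a classic technique: breadth-first search over a grid map, treating shelves and people as obstacles. This is a toy model, not any vendor’s actual planner:

```python
from collections import deque

# Toy sketch of obstacle-aware navigation: breadth-first search on a
# warehouse grid, where '#' marks shelves or people to route around.
def shortest_path_len(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), steps = queue.popleft()
        if (r, c) == goal:
            return steps
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), steps + 1))
    return None  # goal unreachable from start

warehouse = ["..#.",
             "..#.",
             "...."]
print(shortest_path_len(warehouse, (0, 0), (0, 3)))  # 7
```

A real robot replans continuously as its cameras update the map – people move, shelves don’t – but the core routing question is the same.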

Will the robots in 2020 be able to take an order, find goods on a shelf, place them in a container and move them to a postal machine? Affirmative on all counts.

Sharing: Connect rather than own.

The business model that has companies earning money by sharing their unutilized resources has been around for over a decade. In the early days, initiatives such as BlaBlaCar, Couchsurfing, and Airbnb were viewed as startup experiments tapping into the huge potential of communities. These initiatives have by now shown their inherent business value. Interestingly, the trend has evolved over the years. As it lost its luster of alternative novelty, it became a serious, innovative way to turn a profit, invest money and manufacture goods. Fund financing fueled the trend, resulting in today’s popularity of scooters, cars, bicycles, and apartments rented for short durations. However, it is impossible to ignore the disruptive upshot of the phenomenon. “Sharing”, which started out as a glorification of social bonds, turned into “uberization”, with all its downsides. This is not only about Uber’s effects on the taxi business and controversies over drivers’ pay. It also concerns the disruption of property rental rates in big city centers caused by the mass purchasing of apartments for short-term letting. We have come to a point where business on technological steroids requires regulation. If laws can gradually be passed, the trend will keep growing. We will find we can share anything: luxury clothes, computers, photographic equipment, cars, airplanes and virtually any service. As it turns out, connecting instead of owning doesn’t appeal only to the young; it is a reasonable cost-reduction strategy in almost every business. Building your own company in which all equipment is paid for by subscription? Why not? And why not in 2020?


Algorithms scan our bodies

Whether AI can prolong our lives and improve their quality is no longer philosophical speculation. The benefits of machine learning, with its predictive functions that support the analysis of near-infinite data combinations, can be seen in hospitals and medical laboratories around the world. A real, profound change is unfolding before our very eyes, not only in the way we treat and prevent diseases but also in the way we tap into the unutilized potential of our bodies. In the near future, an entire industry will provide the average person with a range of tools to observe the performance of their body. Applications will emerge that take our EKGs and detect early symptoms of serious diseases. Pharmaceutical companies will use artificial intelligence to dramatically accelerate the processing of the kinds of data sets that have so far required years of painstaking analysis. Algorithms capable of combining vast amounts of information about age, medical history, current laboratory results, physical examinations, image analysis, and our DNA will be available not only to specialized laboratories but also to individual doctors. Swallowed capsules containing miniature microscopes will offer insights into our bodies. The data sourced in this way will go into digital files along with data from our personal medical wristbands and DNA test results that will become ever easier to interpret. As such files fill up with continuous data streams, self-learning machines – which work better on large data collections – will deliver an ever more precise picture of our bodies, and therefore of ourselves.
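
A sketch of the predictive idea: a logistic score that folds several patient features into a single probability. The features and weights below are entirely made up for illustration; a real clinical model would be fitted and validated on large datasets:

```python
import math

# Illustrative risk model: a logistic score over a handful of patient
# features. The feature names, weights and bias are made up for this
# sketch; a real clinical model is fitted to validated patient data.
WEIGHTS = {"age": 0.04, "systolic_bp": 0.02, "smoker": 0.9}
BIAS = -7.0

def risk_score(patient):
    """Combine weighted features into a 0..1 probability-like score."""
    z = BIAS + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic squashing function

p = {"age": 62, "systolic_bp": 145, "smoker": 1}
print(round(risk_score(p), 2))  # 0.33 for this hypothetical patient
```

The interesting part is the scale, not the formula: algorithms can score millions of such records, and far richer ones, faster than any team of analysts.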

2020. Printers to replace farms

One of the saddest paradoxes of our civilization is that the production of our most fundamental commodity, food, violates ethical principles, causes environmental degradation and reinforces bad eating habits. This applies primarily to the mass production of meat. And although estimates show that meat consumption will increase by up to 70 percent over the next two decades, our awareness of the woes this will cause is growing. Tech companies and startups are following the trend of ethical and healthy food, joining the debate and proposing innovative solutions. Over the past two years, there has been much talk of an initiative financed by Bill Gates to provide an alternative to the mainstream approach to meat production and consumption. The initiative involves Beyond Meat and Impossible Foods, two companies that offer healthier food and show that mass production of food may well be in line with sustainable development. Technology provides three options. We can grow meat in a laboratory using animal stem cells. We can produce and distribute, on a massive scale, an entirely plant-based product that tastes just like meat; vegan burgers already have a strong foothold on the menus of McDonald’s and Burger King. And there is a third way, being tested by startups: an example is the Israeli company Redefine Meat, which has recently unveiled a burger produced with a 3D printer. While this last proposal may strike you as a bit odd, technological advances can surprise you and change both views and habits. Only one thing is certain: in 2020 it is absolutely imperative to invest in innovative food production. As funding streams in, one can only expect the trend to continue for years to come.



Related articles:

– Artificial intelligence is a new electricity

– Machine, when you will become closer to me?

– Will a basic income guarantee be necessary when machines take our jobs? 

– Can machines tell right from wrong?

– Machine Learning. Computers coming of age

– The brain – the device that becomes obsolete


23 comments

  1. Laurent Denaris

    Considering the different approaches to Artificial Intelligence, we keep assuming we are going to have intelligences like us; I doubt it. I really wonder if they will be as independent and autonomous as we imagine, too. If anything, as far as processing information goes, it could be AI versus / working with augmented humans.

  2. Tom Jonezz

    Unfortunately with advances in AI and Robotics, labor strikes will have less and less impact on output and production – so, we are fu….ed

  3. TomCat

    Private corporations are a very small individual entity in terms of percentage of the population. True, they do bribe our congress right now, but still, they are only a vanishingly small percent of the population.

    Given this comparison, how do you see vicious force being used? Is google/facebook/amazon/apple going to be rich enough so that they hire their own mercenary forces to attack those against their interests? Because you can’t really use politics to get the police and army to use force against your enemies on a 100% certain basis. Because these forces are controlled by the president or local or state governments. You’d need your own military force that reports directly to the company for 100% certainty that they do what you want them to do, when you want them to do it.

  4. JackC

    To demolish fundamentals you have to first “spot” them out, even though they don’t exist. And many people choose to believe so-called science, the “gay gene” thing is one example to believe in “essence”, although it’s not true. So it definitely fosters public awareness in terms of cultural revision.

  5. tom lee

    I donno about artificial *general* intelligence but I feel like the main thing holding back the world of machine learning is data. We have plenty of data. But is it the right data?

    In the last year both OpenAI and DeepMind have achieved some absolutely spectacular results in developing and training deep reinforcement learning agents that can play real-time strategy games (Dota 2 and Starcraft 2) with superhuman ability.

    And I feel like those achievements accurately depict where “Artificial Intelligence” (imho it’d be more intellectually honest to call it linear-algebra-powered statistics, but I don’t work in marketing) is at right now. If you give a deep neural network all the relevant data from the “environment” it needs to approximate a pattern, it will, with many technical caveats, learn to model that pattern. On a theoretical level neural networks are universal function approximators, and while there will always be a wide gulf between theory and practice, the defeat of Dota 2 and Starcraft professional gamers at the hands of deep learning models proves this isn’t all hogwash; they really can model massively complex environments.

    • John Accural

      So you are saying it’s ok to start killing the psychopaths who do create this system?
      I do not have social media. No fb. No instagram. I use reddit for meme’s. I use a fake email that is in no way connected to me IRL except being on my phone. I do not consent to my privacy being stripped. Though I’m sure the slaves of the past did not consent to servitude.
      I do not have time to read the pages upon pages of “terms of use” further more when I can choose to opt out I literally have no way of knowing if they are respecting my wishes.
      This seems like the person is condoning war.

    • Laurent Denaris

      The fear of AI is not that it will take over, but that AIs are programmed by people and, I believe, will therefore have the same flaws as people. I don’t fear an AI overlord but economic devastation caused by an AI that made a decision and crashed the market; once that happens, other AIs will react and make it worse. It’s just an example, but it’s what I fear.

  6. Adam Spark Two

    The inability to predict exponential processes just goes with the territory. Solving Go was 10-20 years out until it suddenly wasn’t. Cars are the same. There appear to be insurmountable obstacles that will remain that way in appearance until suddenly surmounted, and none of the researchers not working on that surmounting will know it’s happening. Very often even the researchers doing the work don’t know they’re on the golden path till very far along.
    It is the nature of such things. My problem comes in when these arguments are used to therefore do nothing, not worry. They ignore that there’s a risk that cars will be driving autonomously sooner than they think, and politically, economically, we’re waiting until it’s for sure coming before beginning to think about the impacts of that.
    I liken it to trying to time the market. Don’t do that, the advice goes. It’s good advice. Likewise, let’s not try to time important AI advances. Rather, let’s get our ducks in a row before it’s sprung on us.

  7. Mac McFisher

    But raw processing power is only one small part of the problem. How do you program personality? How do you abstract the process of abstract thinking itself? We don’t even fully understand how those things work in humans, so how could we possibly recreate them in software?
    Personally, I don’t think we’ll be anywhere near having a human-like AI until we have a MUCH more detailed understanding of human cognition.

  8. Jack666

    Boy, ain’t they fun? Take a look at markov models for even more matrices, I’m doing an on-line machine learning course at the moment and one of our first lectures was covering using eigenvectors for stationary points in page rank. Eigenvectors and comp sci was not something I was expecting (outside of something like graphics)
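
    The stationary-point idea mentioned here can be sketched in a few lines of power iteration; the rank vector is the dominant eigenvector of the damped link matrix, and the three-page link graph below is made up for illustration:

```python
# Toy power iteration for PageRank: the rank vector is the dominant
# eigenvector of the damped link matrix. Three-page web, made-up links.
N, d = 3, 0.85
links = {0: [1, 2], 1: [2], 2: [0]}  # links[i] = pages that page i cites

rank = [1 / N] * N
for _ in range(50):  # iterate toward the stationary distribution
    new = [(1 - d) / N] * N
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    rank = new

print([round(r, 3) for r in rank])  # page 2 collects the most rank
```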

  9. John Accural

    I’m assuming you mean the “being able to learn” AI, not the scripted AI, e.g. the way Alexa works atm.
    AI will get more integrated in things like children’s toys and computer games. Digital assistants will get better at actually learning from me; atm the companies mostly watch and adjust things globally. E.g. in future Alexa will understand that when I ask “how cold is it outside” I don’t want to know the 24h min temp, but the current weather.
    I don’t think there will be a huge jump where we have Commander Data, or Skippy the Magnificent, but it will improve.
    I’d say CS class in schools would add AI, but when I think about what CS class is today, and the speed at which politicians actually adapt to new technology and developments, I’d say 5 years isn’t enough for that

  10. Mac McFisher

    Artificial people are, in my opinion, still a dream. AI (or deep learning, for that matter) is still very fragmented. We have local systems for a given task (or set of tasks). Trying to unify them under a single system is still quite difficult. Some unifying systems (for the task of Visual Question Answering) [1] are out there. There are also some good ideas about giving a system the ability to imagine scenarios [2], but being able to stitch all of these into a truly generic AI is something I think we do not have the ability to produce yet (mostly because it would be a monstrous model to train). My sincere bet is on quantum computing + deep learning. If we manage to have a system that exploits the quantum way of doing things to train a deep model, we can start testing architectures that would otherwise be very unstable and long to train.

    • Zoeba Jones

      Pretty bad analogy:
      – I still have and can use my data after I give a company a copy of some of it.
      – I don’t give away all of my data, or the most important parts of it.
      – I’m creating new private data every day.
      – I can deliberately change some of my data, after I’ve given copies of it, such as by changing email address or phone number or postal address etc.
      – I could generate fake data if wanted to, trying to poison the old data that’s already out there.

  11. Simon GEE

    This is such a good question. I wonder what their approach will be; it can’t be feasible to focus on the statistical methods, which would be insane to try to regulate. IMO it should be focused on what data an organisation has access to, and who gets access to their findings. But that is already considered, and it’s not like we can really control what Google and the like actually decide to do at this point.

  12. CaffD

    One question, Norbert. Why do you think the universe is random? The number of factors explaining why life as we know it is possible on Earth, and why we cannot observe it elsewhere in the known universe, is extreme.

  13. Oniwaban

    Norbert Biedrzycki interesting post. “Since self-learning machines improve over time, assistant devices will become increasingly more competent.” Do you think that this is a matter of 2-3 years or we need to wait like 5-10 years for bots’ improvement? I used bots in a financial industry (as a customer) several times and it always ended with a need of human support.

    • Simon GEE

      Any time a CEO calls to be regulated, it’s because they want a seat at the regulatory table to regulate whatever is being regulated, in such a way that it acts as protectionism for the CEO’s company.

    • Adam Spark Two

      I think it’s easy to do that because they landed on a paradigm of development circa 2010 that has undergone iterative improvement, but the range of predictability on that has been short and highly focused, nor does it translate to much else. I don’t think it’s due to the numbers of researchers. In other words, having thousands of researchers doesn’t beget this predictable iterative improvement; rather, predictable iterative improvement begets thousands of researchers jumping on it.
      Periodically we’ll get these short-lived paths, where an innovation or new idea leads to a paradigm that can (relatively) briefly be iterated up, with scaling up of hardware, optimization of the basic technique, etc. Post DeepMind’s work with Go and then AlphaGo and then AlphaZero, you can see those techniques being iterated and applied to Starcraft and other games. If you wanted you could make some predictions about future game performance from that. I don’t think that changes my view on the overall predictability of most AI, and certainly not AGI.

  14. Peter71

    Amazing! Lots of questions still to be asked. Thank you for sharing with us your knowledge Norbert. Have a happy holiday season! 🙂