Jeff Bezos recently said about Artificial Intelligence: “It's probably hard to overstate how big an impact it's going to have on society over the next 20 years.” The field is still in its early innings; problem solving often relies on brute force, but AI has a huge runway ahead of it.
Due to the confluence of several developments, AI has switched to exponential growth with disruptive potential; ignoring it will be detrimental to any organisation foolhardy enough to do so. Software is eating the world, and the first disruptions by Artificial Intelligence are already discernible.
“The number 73 marks the hour of your downfall” was the oracle that Delphi gave the Roman Emperor Nero. He had just turned 30 and concluded that he would reign long and die at 73. His reign, however, came to a sudden end the very next year, after a revolt led by Galba, a 73-year-old man.
What would the Oracle predict today about the impact of Artificial Intelligence (AI), Silicon Valley’s ‘new new thing’, on your industry and your company? Are you perhaps feeling comfortable about your competitive position today, as Nero did? After all, your organisation’s knowledge is not artificial; it is tangible and hard to copy.
The subject of AI has fascinated humans for many decades, but until recently the results were underwhelming. Over the last five years, however, the field has advanced enough to yield practical applications with broad consumer appeal: intelligent agents (Apple’s Siri, Google Now, Microsoft’s Cortana, Amazon’s Alexa) and services such as the fascinatingly clever Google Photos. But AI is not just an opportunity for the tech industry and a benefit to its users: “software is eating the world” and AI will be one of the next predators! Not utilising it will be detrimental to those foolhardy enough to ignore it.
Look at self-driving vehicles. This is not just a Google whim; it will have an enormous impact on the car industry. Self-driving is not just a matter of mounting some hardware sensors and actuators on an existing vehicle already equipped with GPS and cartography. AI plays an incredibly important role in making it all work. Look at this picture, illustrating the vast amount of data processed and, more importantly, interpreted intelligently in real time.
Bill Gross, founder of Idealab, posted this picture recently and added: “I learned this weekend at an XPrize event that Google's self-driving car gathers 750 megabytes of sensor data per SECOND! That is just mind-boggling to me. Here is a picture of what the car "sees" while it is driving and about to make a left turn. It is capturing every single thing that it sees moving - cars, trucks, birds, rolling balls, dropped cigarette butts, and fusing all that together to make its decisions while driving. If it sees a cigarette butt, it knows a person might be creeping out from between cars. If it sees a rolling ball it knows a child might run out from a driveway. I am truly stunned by how impressive an achievement this is.” (Italics added)
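To get a feel for the scale of that sensor stream, a quick back-of-the-envelope calculation (using only the 750 MB/s figure from the quote above, with decimal units) shows what one hour of driving implies:

```python
# Scale of the sensor stream quoted above: 750 megabytes every second.
mb_per_second = 750

mb_per_hour = mb_per_second * 60 * 60           # seconds in an hour
tb_per_hour = mb_per_hour / 1_000_000           # 1 TB = 1,000,000 MB (decimal)

print(f"{mb_per_hour:,} MB per hour, i.e. {tb_per_hour} TB per hour")
# → 2,700,000 MB per hour, i.e. 2.7 TB per hour
```

In other words, every hour on the road the car must fuse and interpret terabytes of raw sensor data, and it must do so in real time.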
And it will not be just products and manufacturing supply chains that are impacted by AI. Many service industries will experience similar disruptions. Remember IBM’s Watson winning Jeopardy! in 2011? Fast forward and look at Amelia, ‘your first digital employee’, developed by IPsoft, a company we have worked with that pioneered AI in IT services. The Amelia platform can understand, learn and interact to solve problems. ‘She’ reads natural language, understands context, applies logic, infers implications, learns through experience and even senses emotions: she understands what is meant, not simply what is said. Furthermore, Amelia becomes an expert, quickly reading and digesting the same training information as her human ‘colleagues’ and learning from interactions faster.
The potential and likely impact? Imagine service-desk tasks, procurement processing, claims processing in insurance companies or expert advisory roles for field service engineers, lawyers and financial services professionals... 24/7, all year long, reliably and compliantly. This is not science fiction; version 2.0 was released in October 2015.
We should note that brute-force approaches to solving problems are part of the solution, compensating for the fact that current AI algorithms learn and operate much less efficiently than we humans do. For example, Google’s AlphaGo computer beating the world champion of Go required thousands of times more power than the human brain. Nevertheless, we are convinced that the field of AI is in its early innings and has a huge runway ahead of it. AI has switched definitively from linear to exponential growth due to the confluence of a number of technology developments and trends:
- Much of the recent progress comes from improving the effectiveness of the deep-learning algorithms in use. Open-sourcing these algorithms is delivering notable improvements at an accelerated pace. Google has open-sourced the deep-learning software behind services such as Google Photos, and recently Amazon has gone open source as well.
- These ‘learning’ algorithms require training data to get good at their task; the more data you feed them, the better they get. The availability of powerful public cloud hardware has risen sharply, allowing for faster processing, and hence faster learning, at much lower cost.
- The collection of massive amounts of training data has become easier and faster due to the enormously increased use of connected sensors (smartphones, the Internet of Things). Amazon’s ‘Mechanical Turk’ (a crowdsourcing marketplace that applies human intelligence to tasks computers are currently unable to do) has also accelerated learning by producing significantly more training data at a higher pace.
- With Moore’s law, chips run AI algorithms faster, cheaper and more energy-efficiently all the time. Despite the mounting challenges of ongoing miniaturisation, which raise questions about the sustainability of Moore’s law, the chip industry continues to advance. NVIDIA, a leading designer of GPUs (graphics processing units), recently launched a new graphics chip optimised for AI, holding three times more transistors than the previous generation and enabling learning twelve times as fast.
- Instead of using generic chips to run AI algorithms, we also see tailor-made hardware turbocharging progress: earlier this year, research at MIT produced a chip designed specifically for neural networks that is “10 times as energy efficient as a mobile GPU”. This means mobile devices will be able to run powerful AI algorithms locally, even in the absence of connectivity, rather than having to upload data to the internet for central processing, interpretation and response. IBM, too, is introducing radically new and more energy-efficient designs that depart from the Von Neumann architecture principles which have dictated the hardware industry since its beginnings.
- Other big step-change improvements in AI performance come from using smarter algorithms rather than just calculating a lot faster. An example is VocalIQ, a company acquired by Apple last year, which developed a smarter algorithm with a steeper learning curve: “Siri brings in 1 billion queries per week from users to help it get better. But VocalIQ was able to learn with just a few thousand queries and still beat Siri.” It also understands and remembers context. VocalIQ’s AI requires orders of magnitude less training, producing more effective results faster and more easily.
- The big tech companies are exposing their AI algorithms to external parties via programmable interfaces (APIs). This means everyone can access existing, state-of-the-art algorithms and connect them to their own systems: you can put them to use for your specific business situation as configurable functionality and train them with your data. For example, Amazon’s Alexa has two software development kits, one to embed the AI voice-recognition capabilities and one to teach Alexa a new ‘skill’, and the two work together. A very important additional side effect for enterprises is the increased size and scope of the development community. To illustrate the point: SAP’s closed corporate community has about 2.5 million developers, whereas Apple’s open iOS community has roughly 10 million developers, and it hasn’t been around for nearly as long.
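The second trend above, that the same learning algorithm simply gets better the more data you feed it, can be made concrete with a minimal, self-contained sketch. This is plain Python, not any vendor's API; the task (separating points above and below a line) and all numbers are illustrative. A simple perceptron is trained on progressively larger samples and evaluated on a held-out test set:

```python
# Illustration: the identical learning algorithm improves as it sees more data.
import random

random.seed(42)

def make_data(n):
    # Synthetic task: label is 1 when x + y > 1, else 0.
    points = [(random.random(), random.random()) for _ in range(n)]
    return [((x, y), 1 if x + y > 1 else 0) for x, y in points]

def train_perceptron(data, epochs=25, lr=0.1):
    # Classic perceptron update rule on a 2-D input.
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x, y), label in data:
            pred = 1 if w0 * x + w1 * y + b > 0 else 0
            err = label - pred          # -1, 0 or +1
            w0 += lr * err * x
            w1 += lr * err * y
            b += lr * err
    return w0, w1, b

def accuracy(model, data):
    w0, w1, b = model
    hits = sum(1 for (x, y), label in data
               if (1 if w0 * x + w1 * y + b > 0 else 0) == label)
    return hits / len(data)

test_set = make_data(1000)
for n in (10, 100, 1000):
    model = train_perceptron(make_data(n))
    print(f"trained on {n:4d} examples -> test accuracy {accuracy(model, test_set):.2f}")
```

Nothing about the algorithm changes between the three runs; only the amount of training data does, and with it the quality of the learned boundary. This is the dynamic that cheap cloud hardware and abundant sensor data exploit at vastly larger scale.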
We close by quoting Jeff Bezos; at the 2016 Code Conference he commented on AI: “I think it's gigantic — natural language understanding, machine learning in general. [...] It's probably hard to overstate how big an impact it's going to have on society over the next 20 years. [...] Amazon has spent four years working on Alexa behind the scenes, and more than 1,000 people are working on it. [...] And there's so much more to come. It's just the tip of the iceberg.”
In the interview at Start Up Fest Europe, referred to in an earlier post, Eric Schmidt talks about AI (time: 3:16-5:40).