The coming of age for artificial intelligence

Richard Windsor, Radio Free Mobile
London

While artificial intelligence (AI) has been central to sci-fi films for more than 50 years, it has not lived up to the hype in the real world. But now, with unprecedented interest and investment, AI is coming of age.

The AI-like technology that has seemingly been around for some time is really more like advanced statistics. It can analyze big data, but it cannot drive autonomous cars or create deep, intuitive and rich applications and services. What we are beginning to see now, however, suggests the promise of these possibilities (and more) is getting closer.

Changing AI seasons

During the 1980s and '90s, a period nicknamed the "AI winter," sentiment around AI was so poor that researchers did not want to be associated with it. With the current acceleration of the technology's evolution, the AI winter is thawing and giving way to a blossoming AI spring.

The catalysts for the rebirth have been other technological innovations. For one, AI almost exclusively resides in the cloud, and recent infrastructure advances have made high-speed, low-latency data collection possible, resulting in stronger processing capabilities.

Plus, with users performing almost everything on their mobile devices, AI can finally generate value for consumers, who are now willing to pay for digital life services.

Most important, the amount, availability and quality of data -- the lifeblood of AI -- have increased significantly in recent years.

These three developments have created a fundamental shift in AI, so we are seeing light at the end of the tunnel and a chance for AI investments to finally provide a return.

Summer heat

All this said, there is still much work to be done to reach the next season. In order to create the sophisticated algorithms that are the basis of AI and reach "AI summer," we must achieve three goals.

First, while the generation and collection of massive amounts of data have enabled great progress in AI, data for data's sake is of no use. To enable the creation of sophisticated algorithms, data science needs to be applied to extract the relevant and valuable information.

The second goal is enabling more general-purpose AI. Today, the technology is limited to performing one specific task, making AI cumbersome and pretty dumb as a result. We need to train for one task, then build capabilities that let AI apply what it has learnt to a second task, and so on.
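To make the idea concrete, here is a deliberately toy sketch of this kind of transfer: something "learned" on a data-rich first task (here, just feature statistics standing in for a richer representation) is reused so a second task can be tackled with only a couple of examples. All data, labels and numbers below are invented for illustration.

```python
# Toy illustration of transfer learning: statistics (a crude stand-in for a
# learned "representation") fitted on task A are reused to normalize features
# for task B, so task B needs only a handful of labeled examples.

def fit_scaler(rows):
    """Learn per-feature mean and spread from task-A data."""
    n = len(rows)
    dims = len(rows[0])
    means = [sum(r[d] for r in rows) / n for d in range(dims)]
    spreads = [max(abs(r[d] - means[d]) for r in rows) or 1.0 for d in range(dims)]
    return means, spreads

def transform(row, scaler):
    """Map a raw feature row into the transferred, normalized space."""
    means, spreads = scaler
    return [(x - m) / s for x, m, s in zip(row, means, spreads)]

def nearest_centroid(train, scaler):
    """Fit a nearest-centroid classifier in the transferred feature space."""
    by_label = {}
    for row, label in train:
        by_label.setdefault(label, []).append(transform(row, scaler))
    centroids = {label: [sum(c) / len(rows) for c in zip(*rows)]
                 for label, rows in by_label.items()}
    def predict(row):
        z = transform(row, scaler)
        return min(centroids, key=lambda l: sum((a - b) ** 2
                                                for a, b in zip(z, centroids[l])))
    return predict

# Task A: plenty of data; we keep only the learned scaler.
task_a = [[100, 0.1], [200, 0.2], [300, 0.3], [400, 0.4]]
scaler = fit_scaler(task_a)

# Task B: new labels, very few examples. The transferred scaler puts both
# features on a comparable footing, so the centroids are meaningful.
task_b = [([150, 0.35], "up"), ([350, 0.15], "down")]
predict = nearest_centroid(task_b, scaler)
print(predict([180, 0.38]))  # → up
```

Real transfer learning reuses far richer representations than a scaler, but the shape of the trick is the same: don't start task two from scratch.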

Automated models are the third goal. Once an AI algorithm is formulated, it needs to be tested, which is a time-consuming process that calls for highly skilled people. So, to develop AI quickly, we need to teach machines to build models themselves.
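A minimal sketch of what "machines building models" means in practice: instead of a person hand-tuning, the machine searches a space of candidate models and keeps whichever scores best. The one-parameter model family and the tiny spam/ham dataset below are invented purely to illustrate the loop.

```python
# Toy sketch of automated model building: the machine, not a person, searches
# a space of candidate models and keeps the best one.

def make_model(threshold):
    """A one-parameter model family: flag a message as spam above a score threshold."""
    return lambda score: "spam" if score > threshold else "ham"

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

def auto_search(candidates, train, validate):
    """Score every candidate threshold on training data; return the best
    threshold and its accuracy on held-out validation data."""
    best = max(candidates, key=lambda t: accuracy(make_model(t), train))
    return best, accuracy(make_model(best), validate)

train = [(0.2, "ham"), (0.9, "spam"), (0.3, "ham"), (0.8, "spam")]
validate = [(0.1, "ham"), (0.95, "spam")]
best, val_acc = auto_search([t / 10 for t in range(10)], train, validate)
print(best, val_acc)  # → 0.3 1.0
```

Real systems search far larger spaces (architectures, hyperparameters, whole pipelines), but the principle is the same: the human defines the search space and the score; the machine does the tedious testing.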

Coming of age

At this stage, AI would be considered to have come of age, bringing to fruition capabilities like autonomous cars and AI personal assistants. However, I believe these advances, and ones we've yet to imagine, are still many years away. Developing the necessary ability for AI to learn will require additional development in other areas.

Capturing the right data, gaining the appropriate insights and working out how to apply them is still challenging; yet being able to draw meaningful conclusions is the next level of AI.

For example, a photo with lots of blue and white might be identified as someone skiing; recognition is only one facet. Ultimately, AI would know what to do next; perhaps in this case, open a travel app.
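The "know what to do next" step can be sketched as a simple policy that maps a recognized label to a suggested action. The labels and app actions below are entirely hypothetical; real systems would learn this mapping rather than hard-code it.

```python
# Hypothetical sketch: once recognition produces a label, a second stage
# decides what to do with it. All labels and actions here are invented.

NEXT_ACTION = {
    "skiing": "open travel app",
    "receipt": "open expenses app",
    "whiteboard": "open notes app",
}

def suggest(label):
    """Map a recognized scene label to a suggested next action."""
    return NEXT_ACTION.get(label, "no suggestion")

print(suggest("skiing"))  # → open travel app
```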

At this point, digital life services will be so common they'll be nearly commoditized. So the key differentiator in the third stage of AI will be making services richer, deeper and more intuitive.

Will a fourth stage find AI surpassing our own intelligence? At the current rate, possibly by 2040, but don't worry about machines taking over. Maintaining Moore's Law will become prohibitively expensive and difficult, so I don't think there's any risk from AI for a very long time.

