IoT will be the next driver of AI innovation. By 2025, there will be 55 billion IoT devices (Business Insider Intelligence). Due to latency, cost, privacy, and connectivity constraints, being able to analyze data at the edge, where it is created, is critical: it speeds up both analysis and decision-making.

Data analytics has generally relied on human-defined classifiers or “feature extractors”: rules that range from something as simple as a linear regression to more complicated machine learning algorithms. But can you imagine building a perfect human-defined, rule-based system to model everything?
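
To make the contrast concrete, here is a minimal, hypothetical sketch in Python. The dataset, threshold, and feature names are all invented for illustration; the point is only the difference between a rule an engineer hard-codes and a boundary a model fits from data (using scikit-learn’s LogisticRegression).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy sensor readings: temperature and vibration for a hypothetical machine.
rng = np.random.default_rng(0)
X = rng.normal(loc=[60.0, 0.5], scale=[10.0, 0.2], size=(200, 2))
# Invented ground truth: the machine is "faulty" when a weighted mix is high.
y = (0.04 * X[:, 0] + 3.0 * X[:, 1] > 4.0).astype(int)

# 1) A human-defined rule: an engineer hard-codes a threshold on one feature.
def rule_based(x):
    return int(x[0] > 70.0)  # "flag anything hotter than 70 degrees"

# 2) A learned classifier: the decision boundary is fit from the data itself.
learned = LogisticRegression().fit(X, y)

sample = np.array([[65.0, 0.9]])
print("rule-based:", rule_based(sample[0]))
print("learned:   ", learned.predict(sample)[0])
```

The hand-written rule works only as far as the engineer anticipated; the learned model picks up the interaction between features on its own, which is exactly what becomes impossible to hand-code at IoT scale.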

It doesn’t take a fortune teller to see that AI on IoT is where the next wave of innovation and opportunity will be.

Here’s an interesting piece in VentureBeat that explores the space and why it’s taking off now.

“We’re seeing things today that people have always seen in movies and dreamed of doing at home become ordinary everyday use cases for users on their smartphones,” says Jeff Gehlhaar, Vice President, Technology and Head of AI Software Platforms at Qualcomm Technologies, Inc.

That includes always-on capabilities, such as smartphone assistant features like voice wake-up, always-on noise suppression, language understanding, disambiguation of circumstance, or the ability to hear and understand you at varying distances from your device. On-device AI also powers on-demand, high-performance smartphone capabilities such as instantaneous language translation and more.
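
The common pattern behind “always-on” features is to run a very cheap detector continuously and only wake a heavier model when it fires. The snippet below is a toy, hypothetical sketch of that staging idea, with simulated audio frames, an energy threshold as the cheap detector, and a placeholder for the expensive model; it is not Qualcomm’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

def cheap_always_on_detector(frame, threshold=0.02):
    """Cheap, always-running check: average signal energy in one audio frame."""
    return float(np.mean(frame ** 2)) > threshold

def expensive_model(frame):
    """Placeholder for the heavy model (keyword spotter, ASR, etc.)."""
    return "possible wake word" if frame.max() > 0.5 else "ignore"

# Simulate a stream of short audio frames: mostly silence, occasionally speech.
for i in range(100):
    loud = rng.random() < 0.05
    frame = rng.normal(0.0, 0.5 if loud else 0.01, size=160)
    if cheap_always_on_detector(frame):
        # Only now do we pay for the expensive model.
        print(f"frame {i}: detector fired ->", expensive_model(frame))
```

Keeping the always-running stage this cheap is what makes the feature viable on a battery-powered device.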

Nvidia CEO Jensen Huang held up the Jetson Nano, the company’s smallest computer ever, onstage during the GTC keynote address in San Jose, California.

The Jetson Nano is the newest embedded computer in Nvidia’s Jetson line, aimed at developers deploying AI at the edge, and the goal is to make that affordable.

The Jetson Nano developer kit is available today for $99, while the $129 Jetson Nano production module for embedded devices will be available in June.
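
For a sense of what “deploying AI on the edge” looks like in practice, here is a minimal, hypothetical sketch that times local inference with PyTorch on whatever accelerator the board exposes. It assumes a recent torchvision and uses MobileNetV2 with random weights purely to measure latency; a real Jetson deployment would typically be optimized further (for example with TensorRT).

```python
import time
import torch
import torchvision.models as models

# Use the GPU if the board exposes one (e.g. a Jetson-class device), else CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Random-weight MobileNetV2: small enough for embedded boards; the weights
# don't matter here because we only measure on-device inference latency.
model = models.mobilenet_v2(weights=None).eval().to(device)
frame = torch.randn(1, 3, 224, 224, device=device)  # stand-in for a camera frame

with torch.no_grad():
    for _ in range(5):            # warm-up iterations
        model(frame)
    if device.type == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(20):
        model(frame)
    if device.type == "cuda":
        torch.cuda.synchronize()
    elapsed = (time.perf_counter() - start) / 20

print(f"{device}: ~{elapsed * 1000:.1f} ms per frame")
```

Running this loop locally, with no round trip to a data center, is the latency argument for edge AI in miniature.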

I’ve long thought that the technology industry sometimes resembles the fashion industry: trends come, go, and come back, albeit slightly differently. The recent rise of “edge computing” bears witness to this idea.

In this video, a16z partner Peter Levine takes us on a “crazy” tour of the history and future of cloud computing: the constant swings between centralized and distributed computing, and even his “Forrest Gump rule” for investing in these shifts.