Azure Data Box Edge is a server-class target for Azure IoT Edge.

Sometimes people want to run heavier workloads via IoT Edge than will fit on traditional gateways. Data Box Edge offers a server-class machine to do so.

During this episode of the IoT Show, get introduced to the device's capabilities and watch a short demo of how simple it is to set up and configure from the cloud.

Learn more HERE

AI has become a major driver of edge computing adoption. The edge computing layer was originally meant only to bring compute, storage, and processing capabilities closer to IoT deployments. Sensitive data that cannot be sent to the cloud is also analyzed and processed at the edge.

Here’s an overview of the players and the state of the edge computing art.

Three AI accelerators on the market today are the NVIDIA Jetson, the Intel Movidius Myriad chips, and the Google Edge Tensor Processing Unit (TPU). All three are highly optimized for edge pipeline workloads and will see increased usage over the coming years. As AI continues to become a key driver of the edge, the combination of hardware accelerators and software platforms is becoming important for running models for inferencing. By accelerating AI inferencing, the edge will become an even more valuable tool and change the ML pipeline as we know it.

I’ve often referred to edge computing in many posts, but here’s a great article on why it will revolutionize IoT and help it really transform entire industries along with our everyday lives.

The edge is where data gets generated, events occur, and things and people interact. The key is putting intelligence there. The Internet of Things (IoT) holds great promise for improving operational efficiencies and vastly reducing costly downtime. But for IoT to realize its potential, computational challenges must be overcome. Even […]

IoT will be the next driver of AI innovation. By 2025, there will be 55 billion IoT devices (Business Insider Intelligence). Due to latency, cost, privacy, and connectivity issues, being able to analyze data at the edge where it's created is critical: it improves the speed of analysis and decision-making.

Data analytics has generally relied on human-defined classifiers or "feature extractors": rules that range from something as simple as a linear regression to more complicated machine learning algorithms. But can you imagine building a perfect human-defined, rule-based system to model everything?
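To make the "human-defined classifier" idea concrete, here is a minimal sketch in Python. The sensor readings, the 75 °C threshold, and the hand-fitted linear trend are all made-up illustrative values, not from any real deployment; the point is that every number in such a system is chosen by a person, which is exactly why rule-based systems don't scale to modeling everything.

```python
# A human-defined classifier: a hand-written rule for flagging
# overheating in an IoT temperature feed. The entire "model" is one
# expert-chosen threshold (an illustrative value, not a real spec).
def rule_based_alert(temp_c: float) -> bool:
    return temp_c > 75.0

# A slightly richer human-defined rule: a hand-fitted linear trend used
# to guess the next reading. The 0.9 slope and 8.0 intercept were picked
# by hand, the kind of manual feature engineering the text describes.
def predicted_next_temp(current_temp_c: float) -> float:
    return 0.9 * current_temp_c + 8.0

readings = [70.2, 73.5, 76.1, 74.9]          # made-up sensor samples
alerts = [rule_based_alert(t) for t in readings]
print(alerts)                                 # [False, False, True, False]
print(predicted_next_temp(76.1))              # 76.49
```

Each rule works only for the narrow case its author anticipated; learned models replace these hand-chosen numbers with parameters fit from data, which is what edge AI inferencing brings to the device itself.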

It doesn’t take a fortune teller to see that AI on IoT is where the next wave of innovation and opportunity is going to be.

Here’s an interesting piece in VentureBeat that explores the space and why it’s taking off now.

“We’re seeing things today that people have always seen in movies and dreamed of doing at home become ordinary everyday use cases for users on their smartphones,” says Jeff Gehlhaar, Vice President, Technology and Head of AI Software Platforms at Qualcomm Technologies, Inc.

That includes always-on capabilities, for instance, smartphone assistant features like voice wake-up, always-on noise suppression, language understanding, disambiguation of circumstance, or the ability to hear and understand you at varying distances from your device’s speaker. It also powers on-demand, high-performance smartphone capabilities such as instantaneous language translation and more.

Nvidia CEO Jensen Huang holds up the Jetson Nano, the company's smallest computer ever, onstage during the GTC keynote address in San Jose, California.

Nvidia announced this new embedded computer in its Jetson line for developers deploying AI at the edge, with the goal of making it affordable.

The Jetson Nano developer kit is available today for $99, while the $129 Jetson Nano module for embedded devices will be available in June.

I’ve long thought that the technology industry sometimes resembles the fashion industry, in that trends come, go, and come back, albeit in slightly different form. The recent rise of “edge computing” bears witness to this idea.

In this video, a16z partner Peter Levine takes us on a “crazy” tour of the history and future of cloud computing, from the constant swings between centralized and distributed computing to his “Forrest Gump rule” of investing in these shifts.