Did you know that you can now train machine learning models with Azure ML once and deploy them seamlessly both in the cloud (AKS/ACI) and on the edge (Azure IoT Edge), thanks to the ONNX Runtime inference engine?

In this new episode of the IoT Show, learn about ONNX Runtime, the Microsoft-built inference engine for ONNX models: it is cross-platform, works across training frameworks, and delivers on-par or better performance than existing inference engines.
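
To make that concrete, here is a minimal sketch of running inference with ONNX Runtime in Python. The model path ("model.onnx"), input name, and input shape are assumptions for illustration only:

```python
import numpy as np
import onnxruntime as ort

# Load an exported ONNX model (path is hypothetical) and let ONNX Runtime
# choose an available execution provider for the host hardware.
session = ort.InferenceSession("model.onnx")

# Build a dummy input matching the model's expected shape (assumed here
# to be a single 1x3x224x224 float32 image tensor).
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference; the same code works unchanged in the cloud or on an edge device.
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```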
From the description:
We will show how to train and containerize a machine learning model using Azure Machine Learning, then deploy the trained model to a container service in the cloud and to an Azure IoT Edge device with IoT Edge, across different hardware platforms: Intel, NVIDIA, and Qualcomm.
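
For reference, a sketch of that register-and-deploy step using the v1 azureml-core Python SDK is shown below. The workspace config, conda file, scoring script ("score.py"), model path, and service name are all assumptions for illustration; the episode itself walks through the full workflow:

```python
from azureml.core import Workspace, Environment
from azureml.core.model import Model, InferenceConfig
from azureml.core.webservice import AciWebservice

# Connect to an existing Azure ML workspace (assumes a local config.json).
ws = Workspace.from_config()

# Register the exported ONNX model with the workspace.
model = Model.register(workspace=ws,
                       model_path="model.onnx",      # hypothetical local path
                       model_name="my-onnx-model")

# Environment with ONNX Runtime installed; score.py is a hypothetical
# scoring script that loads the model with onnxruntime.InferenceSession.
env = Environment.from_conda_specification(name="onnx-env",
                                           file_path="conda.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Deploy to Azure Container Instances; the same registered model can also
# be targeted at AKS or packaged as an Azure IoT Edge module.
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Model.deploy(ws, "onnx-aci-service", [model],
                       inference_config, deployment_config)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)
```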
