Here’s an interesting new product from Microsoft: Azure Lighthouse.

Grow your business profitably and efficiently to service more customers, larger workloads, and more mission-critical apps with precision on Microsoft Azure. Azure Lighthouse provides advanced automation on Azure so you can confidently manage multiple customers’ Azure estates at scale and protect your management IP.

With Azure Lighthouse, your customers get greater visibility into service provider activities, increasing transparency and trust. Discover how this foundational management capability works consistently across Azure services and licensing models to streamline managed service operations and help partners focus more on providing differentiated services to customers. Learn more: https://aka.ms/azurelighthouse

AI has become a major driver of edge computing adoption. The edge computing layer was originally meant only to deliver modest compute, storage, and processing capabilities to IoT deployments; it also handles sensitive data that cannot be sent to the cloud for analysis and processing.

Here’s an overview of the players and the current state of the art in edge computing.

Three AI accelerators on the market today are NVIDIA Jetson, Intel Movidius Myriad chips, and Google Edge Tensor Processing Units (TPUs). All three are highly optimized for edge inference pipelines and will see increased usage over the coming years. As AI continues to become a key driver of the edge, the combination of hardware accelerators and software platforms is becoming essential for running models for inferencing. By accelerating AI inferencing, the edge will become an even more valuable tool and change the ML pipeline as we know it.
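To make the inferencing step concrete, here is a minimal sketch of edge-side scoring using ONNX Runtime’s C# API rather than any of the vendor-specific SDKs behind the accelerators above; the model path, input name, and tensor shape are placeholders for whatever your exported model actually declares.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;          // NuGet: Microsoft.ML.OnnxRuntime
using Microsoft.ML.OnnxRuntime.Tensors;

class EdgeInference
{
    static void Main()
    {
        // "model.onnx", the input name "input", and the 1x3x224x224 shape are
        // placeholders; use the names and shapes your model actually exposes.
        using var session = new InferenceSession("model.onnx");

        var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
        // ...fill `input` with preprocessed camera or sensor data here...

        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor("input", input)
        };

        // Score the model locally on the edge device; no round trip to the cloud.
        using var results = session.Run(inputs);
        float[] scores = results.First().AsEnumerable<float>().ToArray();

        Console.WriteLine($"Top score: {scores.Max():F4}");
    }
}
```

ONNX Runtime can also be paired with accelerator-specific execution providers (for example TensorRT on Jetson or OpenVINO for Movidius hardware), which is how a generic sketch like this can end up benefiting from the accelerators listed above.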

ML.NET, Microsoft’s open source machine learning framework, has been updated to version 1.2. Here’s a quick rundown of the updates. Read the article on Visual Studio Magazine to find out more.

  • General availability of TimeSeries support for forecasting and anomaly detection (a spike-detection sketch follows this list)
  • General availability of ML.NET packages to use TensorFlow and ONNX models
  • Easily integrate ML.NET models in web or serverless apps with the Microsoft.Extensions.ML integration package (preview), also sketched after this list
  • ML.NET CLI updated to 0.14 (preview)
  • Model Builder updates:
    • Expanded support for .txt files and more delimiters for values
    • No limits on training data size
    • Smart defaults for training time on large datasets
    • Improved model consumption experience
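
To illustrate the TimeSeries bullet above, here is a minimal sketch of IID spike detection with the Microsoft.ML.TimeSeries package; the SalesRecord and SpikePrediction classes and the toy series are invented for the example.

```csharp
using System;
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.Data;

// Hypothetical input/output types for the sketch.
public class SalesRecord
{
    public float Sales { get; set; }
}

public class SpikePrediction
{
    // DetectIidSpike emits a 3-element vector: alert flag, raw score, p-value.
    [VectorType(3)]
    public double[] Prediction { get; set; }
}

class Program
{
    static void Main()
    {
        var mlContext = new MLContext();

        // Toy series with one obvious spike at position 20.
        var data = Enumerable.Range(1, 30)
            .Select(i => new SalesRecord { Sales = i == 20 ? 500f : 10f + i })
            .ToList();
        IDataView dataView = mlContext.Data.LoadFromEnumerable(data);

        // IID spike detector from the Microsoft.ML.TimeSeries NuGet package.
        var pipeline = mlContext.Transforms.DetectIidSpike(
            outputColumnName: nameof(SpikePrediction.Prediction),
            inputColumnName: nameof(SalesRecord.Sales),
            confidence: 95,
            pvalueHistoryLength: 10);

        var transformed = pipeline.Fit(dataView).Transform(dataView);
        var predictions = mlContext.Data
            .CreateEnumerable<SpikePrediction>(transformed, reuseRowObject: false);

        foreach (var p in predictions)
            Console.WriteLine(
                $"Alert: {p.Prediction[0]}  Score: {p.Prediction[1]:F2}  P-value: {p.Prediction[2]:F4}");
    }
}
```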
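
And here is a sketch of the Microsoft.Extensions.ML integration from the third bullet, shown with a generic host instead of a full ASP.NET Core app; the SentimentInput/SentimentPrediction types, the "SentimentModel" name, and the model.zip path are placeholders for whatever your trained model actually expects.

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.ML;

// Hypothetical schema types; replace with the columns your model was trained on.
public class SentimentInput
{
    public string SentimentText { get; set; }
}

public class SentimentPrediction
{
    public bool PredictedLabel { get; set; }
    public float Probability { get; set; }
}

class Program
{
    static void Main(string[] args)
    {
        var host = Host.CreateDefaultBuilder(args)
            .ConfigureServices(services =>
            {
                // Registers a pool of prediction engines and reloads the model
                // whenever the file changes on disk ("model.zip" is a placeholder).
                services.AddPredictionEnginePool<SentimentInput, SentimentPrediction>()
                    .FromFile("SentimentModel", "model.zip", watchForChanges: true);
            })
            .Build();

        var pool = host.Services
            .GetRequiredService<PredictionEnginePool<SentimentInput, SentimentPrediction>>();

        var prediction = pool.Predict(
            "SentimentModel",
            new SentimentInput { SentimentText = "This was a great release!" });

        Console.WriteLine($"Positive: {prediction.PredictedLabel} ({prediction.Probability:P1})");
    }
}
```

In a web app the same PredictionEnginePool is injected into a controller through constructor injection, which is the scenario this package targets.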