Capacity planning plays a critical role both in migrating an existing application and in designing a new one. In the final part of this three-part series with Silvano Coriani, we'll review various Azure SQL capacity planning scenarios, along with best practices and recommendations on what to use when.

Time index:

  • 0:00 Introduction
  • 1:26 Greenfield development scenario
  • 5:42 Existing applications on-prem or other clouds scenario
  • 7:42 On-prem demo
  • 12:01 Changes in existing Azure SQL DB apps scenario

Adding GPU compute support to Windows Subsystem for Linux (WSL) has been the #1 most requested feature since the first WSL release.

Learn how Windows and WSL 2 now support GPU Accelerated Machine Learning (GPU compute) using NVIDIA CUDA, including TensorFlow and PyTorch, as well as the Docker and NVIDIA Container Toolkit support you would expect in a native Linux environment.

Clark Rahig will explain a bit about what it means to use your GPU to accelerate the training of Machine Learning (ML) models, introduce concepts like parallelism, and then show how to set up and run your full ML workflow (including GPU acceleration) with NVIDIA CUDA and TensorFlow in WSL 2.
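
For a taste of what that workflow looks like, here is a minimal sketch (not taken from the video): from Python inside a WSL 2 distro with the NVIDIA CUDA driver for WSL and a GPU-enabled TensorFlow build installed, you can confirm that TensorFlow sees the GPU before kicking off training.

    import tensorflow as tf

    # List the GPU devices visible to TensorFlow inside the WSL 2 distro.
    gpus = tf.config.list_physical_devices("GPU")
    print("GPUs visible to TensorFlow:", gpus)

    if gpus:
        # Run a small matrix multiply on the GPU as a quick sanity check.
        with tf.device("/GPU:0"):
            a = tf.random.normal((1024, 1024))
            b = tf.random.normal((1024, 1024))
            c = tf.matmul(a, b)
        print("GPU matmul result shape:", c.shape)
    else:
        print("No GPU found; check the WSL 2 CUDA driver and TensorFlow install.")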

Additionally, Clark will demonstrate how students and beginners can start building knowledge in the ML space on their existing hardware by using the TensorFlow with DirectML package.
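
For the DirectML path, a minimal sketch, assuming the tensorflow-directml preview package (which follows the TensorFlow 1.15 API): install it with pip, then run a small op, which is dispatched to any DirectX 12 capable GPU rather than requiring CUDA hardware.

    # pip install tensorflow-directml   (assumed package name for the
    # TensorFlow with DirectML preview, which tracks the TF 1.15 API)
    import tensorflow.compat.v1 as tf

    tf.enable_eager_execution()

    # A small tensor op; with tensorflow-directml installed it runs on the
    # DirectML device backed by any DirectX 12 capable GPU.
    result = tf.add([1.0, 2.0], [3.0, 4.0])
    print(result)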

Learn more: