
Azure Synapse workspaces can host a Spark cluster.

In addition to providing the execution environment for certain Synapse features such as Notebooks, a Synapse-hosted Spark cluster can also run custom code that you write and submit as a job.

This video walks through the process of running a C# custom Spark job in Azure Synapse. It shows how to create the Synapse workspace in the Azure portal, how to add a Spark pool, and how to configure a suitable storage account. It also shows how to write the custom job in C#, how to upload the built output to Azure, and then how to configure Azure Synapse to execute the .NET application as a custom job.
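
For context, a minimal .NET for Apache Spark job along the lines described in the video might look something like the sketch below. The application name, storage paths, view name, and query are hypothetical placeholders rather than the exact code shown on screen; the overall shape (namespace directive, Spark session, work defined with Spark SQL) follows the steps in the time index.

  using Microsoft.Spark.Sql;

  namespace MySparkApp
  {
      class Program
      {
          static void Main(string[] args)
          {
              // Create (or reuse) the Spark session provided by the Synapse Spark pool.
              SparkSession spark = SparkSession
                  .Builder()
                  .AppName("SynapseCustomJob")
                  .GetOrCreate();

              // Hypothetical input path in the workspace's ADLS Gen2 storage account.
              string inputPath =
                  "abfss://<container>@<storageaccount>.dfs.core.windows.net/input/people.csv";

              // Load the source data and expose it to Spark SQL as a temporary view.
              DataFrame input = spark.Read().Option("header", "true").Csv(inputPath);
              input.CreateOrReplaceTempView("people");

              // Define the work with Spark SQL: a simple aggregation over the view.
              DataFrame result = spark.Sql(
                  "SELECT city, COUNT(*) AS person_count FROM people GROUP BY city");

              result.Show();

              // Write the results back to storage so they can be inspected after the job runs.
              result.Write().Mode(SaveMode.Overwrite)
                  .Csv("abfss://<container>@<storageaccount>.dfs.core.windows.net/output/");

              spark.Stop();
          }
      }
  }

Once built and published, the output is zipped and uploaded to the workspace, where it is referenced from a custom Spark job definition, as the steps below show.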

Topics/Time index:

  • Creating a new Azure Synapse Analytics workspace (0:17)
  • Configuring security on the storage account (1:29)
  • Exploring the workspace (2:42)
  • Creating an Apache Spark pool (3:01)
  • Creating the C# application (4:05)
  • Adding a namespace directive to use Spark SQL (4:48)
  • Creating the Spark session (5:01)
  • How the job will work (5:22)
  • Defining the work with Spark SQL (6:42)
  • Building the .NET application to upload to Azure Synapse (9:48)
  • Uploading our application to Azure Synapse (11:45)
  • Using the zipped .NET application in a custom Spark job definition (12:39)
  • Testing the custom job (13:36)
  • Monitoring the job (13:56)
  • Inspecting the results (14:25)