Chris Seferlis discusses one of the newer, lesser-known data services in Azure: Azure Data Explorer.
If you're looking to run extremely fast queries over large sets of log and IoT data, this may be the right tool for you. He also discusses why it is not a replacement for Azure Synapse or Azure Databricks, but works nicely alongside them in the overall architecture of the Azure Data Platform.
Erik Roll from BlueGranite steps through the 2020 health outlook and the environmental factors that impact organizational data. How is your data estate affected, and how can the modern data platform in Azure help organizations respond to current events?
Data, like our experiences, is always evolving and accumulating. To keep up, our mental models of the world must adapt to new data, some of which contains new dimensions: new ways of seeing things we had no conception of before. These mental models are not unlike a table's schema, defining how we categorize and process new information.
This brings us to schema management. As business problems and requirements evolve over time, so too does the structure of your data. With Delta Lake, as the data changes, incorporating new dimensions is easy. Users have access to simple semantics to control the schema of their tables. These tools include schema enforcement, which prevents users from accidentally polluting their tables with mistakes or garbage data, as well as schema evolution, which enables them to automatically add new columns of rich data when those columns belong. In this webinar, we’ll dive into the use of these tools.
In this webinar you will learn about:
Understanding table schemas and schema enforcement
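The interplay between schema enforcement and schema evolution can be sketched with a toy model. The class below is a conceptual illustration in plain Python, not the Delta Lake API (a real table would be written with `df.write.format("delta")`, with the `mergeSchema` option controlling evolution); the `ToyTable` name and its methods are invented for this sketch.

```python
class ToyTable:
    """Toy stand-in for a Delta table: a schema plus a list of rows."""

    def __init__(self, schema):
        self.schema = set(schema)   # column names the table currently accepts
        self.rows = []

    def append(self, row, merge_schema=False):
        new_cols = set(row) - self.schema
        if new_cols and not merge_schema:
            # Schema enforcement: reject writes that introduce unknown columns.
            raise ValueError(f"schema mismatch: unexpected columns {sorted(new_cols)}")
        # Schema evolution: fold any new columns into the table schema.
        self.schema |= new_cols
        # Columns absent from this row are stored as None, mirroring how
        # older rows read as null after a column is added.
        self.rows.append({c: row.get(c) for c in self.schema})


t = ToyTable(["id", "value"])
t.append({"id": 1, "value": 10})                       # matches schema: accepted
try:
    t.append({"id": 2, "value": 20, "source": "iot"})  # new column: rejected
except ValueError as e:
    print(e)
t.append({"id": 2, "value": 20, "source": "iot"}, merge_schema=True)
print(sorted(t.schema))  # → ['id', 'source', 'value']
```

The key design point the toy preserves is that enforcement is the default and evolution is opt-in per write, so garbage columns cannot leak into a table silently.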
The transaction log is key to understanding Delta Lake because it is the common thread that runs through many of its most important features, including ACID transactions, scalable metadata handling, time travel, and more. In this session, we’ll explore what the Delta Lake transaction log is, how it works at the file level, and how it offers an elegant solution to the problem of multiple concurrent reads and writes.
In this tech talk you will learn about:
What is the Delta Lake transaction log?
What is the transaction log used for?
How does the transaction log work?
Reviewing the Delta Lake transaction log at the file level
Dealing with multiple concurrent reads and writes
How the Delta Lake transaction log enables other use cases, including time travel, data lineage, and debugging
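The file-level mechanics described above can be illustrated with a small sketch. This is a simplified model of the idea, not Delta Lake's actual protocol: the file names, directory layout, and action format below are assumptions made for illustration. Each commit is a zero-padded, numbered JSON file of add/remove actions, and the table state at any version is recovered by replaying commits in order.

```python
import json
import os
import tempfile

def commit(log_dir, version, actions):
    """Write one commit file. Mode 'x' fails if the file already exists, so
    two writers racing to claim the same version conflict cleanly -- the
    essence of optimistic concurrency control for concurrent writes."""
    path = os.path.join(log_dir, f"{version:020d}.json")
    with open(path, "x") as f:
        json.dump(actions, f)

def state(log_dir, as_of=None):
    """Replay the log in order to compute which data files make up the table."""
    files = set()
    for name in sorted(os.listdir(log_dir)):
        version = int(name.split(".")[0])
        if as_of is not None and version > as_of:
            break  # time travel: stop replaying at the requested version
        with open(os.path.join(log_dir, name)) as f:
            for action in json.load(f):
                if action["op"] == "add":
                    files.add(action["file"])
                else:
                    files.discard(action["file"])
    return files


log_dir = tempfile.mkdtemp()
commit(log_dir, 0, [{"op": "add", "file": "part-0.parquet"}])
commit(log_dir, 1, [{"op": "add", "file": "part-1.parquet"}])
commit(log_dir, 2, [{"op": "remove", "file": "part-0.parquet"}])
print(sorted(state(log_dir)))            # → ['part-1.parquet']
print(sorted(state(log_dir, as_of=1)))   # → ['part-0.parquet', 'part-1.parquet']
```

Because the log is append-only and replayable, time travel falls out for free: reading the table "as of" version 1 is just replaying fewer commits.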
In this episode of the AI Show, Qun Ying shows us how to build an end-to-end solution using the Anomaly Detector and Azure Databricks. This step-by-step demo detects numerical anomalies from streaming data coming through Azure Event Hubs.
Anomaly Detection on Streaming Data Using Azure Databricks