I’ll never forget the first time I heard of non-Euclidean spaces. It made sense and no sense all at the same time. Since making the switch into data science, I’ve come to understand them and their uses better. However, I never really tried to visualize these spaces.

Fortunately(?), someone has created a rendering engine that lets you explore these spaces and, surprise(!), it may have uses for VR.

Here’s an interesting story about how data analytics, specifically NLP, and data visualization can breathe new life into classic works of literature.

Phil Harvey, a Cloud Solution Architect at Microsoft in the UK, used the company’s Text Analytics API on 19 of The Bard’s plays. The API, which is available to anyone as part of Microsoft’s Azure Cognitive Services, can be used to identify sentiment and topics in text, as well as pick out key phrases and entities. This API is one of several Natural Language Processing (NLP) tools available on Azure.
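For a sense of what calling the Text Analytics API looks like, here’s a minimal sketch in Python. The region, key, and API version are placeholders (the v2.1 key-phrases endpoint shown here is one of the service’s documented routes, but check the current Azure docs for your subscription); only the payload-building step runs without credentials.

```python
import json

# Placeholders -- substitute your own Azure region and subscription key.
REGION = "westus"
ENDPOINT = f"https://{REGION}.api.cognitive.microsoft.com/text/analytics/v2.1/keyPhrases"

def build_payload(texts):
    """Shape a list of strings into the `documents` payload the API expects."""
    return {
        "documents": [
            {"id": str(i), "language": "en", "text": t}
            for i, t in enumerate(texts, start=1)
        ]
    }

payload = build_payload([
    "To be, or not to be, that is the question.",
    "Now is the winter of our discontent.",
])
print(json.dumps(payload, indent=2))

# To actually call the service (requires the `requests` package and a valid key):
# import requests
# resp = requests.post(
#     ENDPOINT,
#     headers={"Ocp-Apim-Subscription-Key": "<your-key>"},
#     json=payload,
# )
# key_phrases = resp.json()
```

The same payload shape works across the sentiment and entity endpoints; only the final path segment changes.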

As an added bonus, I think there should be an AMC series set in Elizabethan times mirroring the events of Breaking Bad.

Last week, Microsoft announced that Azure Data Box Edge has gone GA. Azure Data Box Edge is a hybrid cloud platform that brings compute and storage closer to the data source.

Forbes has a nice write-up on the technology and why it’s crucial to hybrid cloud deployments.

Azure Data Box Edge is the cornerstone of Microsoft’s hybrid cloud platform. It plays a crucial role in the “intelligent cloud and intelligent edge” strategy of the company. The product belongs to the Azure Data Box portfolio that offers both online and offline solutions for transferring bulk data to the cloud.

Databricks announced that it has open-sourced Delta Lake, a storage layer that makes it easier to ensure data integrity as new data flows into an enterprise’s data lake by bringing ACID transactions to these big data repositories. TechCrunch has an article detailing why this is a big deal.

The tool provides the ability to enforce specific schemas (which can be changed as necessary), to create snapshots and to ingest streaming data or backfill the lake as a batch job. Delta Lake also uses the Spark engine to handle the metadata of the data lake (which by itself is often a big data problem). Over time, Databricks also plans to add an audit trail, among other things.

Here’s an interesting episode of AI Today where hosts Kathleen Walch and Ronald Schmelzer talk about the 7 Patterns of AI. Press the play button below to listen here or visit the show page.

Show Notes

AI Today Podcast #85: The Seven Patterns of AI

Cognilytica has spent a considerable amount of time on AI use cases and how different industries are using various AI and cognitive technologies and we’ve found that there are seven common patterns that seem to continuously show up in all these use cases. Some use cases use a single pattern for their application while others combine a few together.

Read more …

Databricks first introduced MLflow last June. Immediately, startups and larger enterprises started using it to manage their machine learning lifecycles. Since then, more than 80 contributors from some 40 companies have worked on the open source machine learning tool, and it regularly sees more than 500,000 downloads per month.

And check out this recent news:

Unveiled at the Spark + AI Summit 2019, sponsored by Databricks, the new Databricks and Microsoft collaboration is a sign of the companies’ deepening ties, but it is also too new to say how effectively the partnership will advance MLflow for developers, said Mike Gualtieri, a Forrester analyst.

AI offers the chance for small businesses to compete more nimbly with larger competitors. No longer does it take a massive capital investment to procure serious computing power or massive amounts of storage. Both compute and storage are available on demand via the cloud and it’s the powerful combo of compute and data that make AI possible. However, like any new technology, there’s fact and fiction. What are the dangers and opportunities of machine learning for small businesses? And how can they avoid the pitfalls?

Like many new technologies, Artificial Intelligence (AI) and Machine Learning (ML) were initially available largely to the top players in every industry. But as we have seen with many past innovations, they gradually become available to everyone in the long run. Signs already indicate that AI and ML will play a major role in leveling the playing field for the small and medium-sized businesses of the future.

Eamon O’Reilly joins Scott Hanselman to show how PowerShell in Azure Functions makes it possible for you to automate operational tasks and take advantage of the native Azure integration to deliver and maintain services.

Here’s an interesting perspective on what blockchain and open source have in common and how they will enrich each other in the years to come.

The many similarities between blockchain and open source are not just a coincidence. Analysts and developers believe that the new technology is picking up where open source left off. There is a limit to what companies can share with open source: open source is not known for opening up live systems, and it can never open up their data.

Here’s an interesting look at the use of AI and machine learning in the geospatial world. Given the huge datasets found in remote sensing, it’s not surprising to see that field leading the way in cutting-edge data analytics.

From a geospatial perspective, machine learning has long been in wide use. Remote sensing datasets have always been large, so machine learning’s ability to process data at scale has been a natural fit. For example, classifying satellite images with K-Means or ISODATA clustering algorithms was one of the earliest uses of remote sensing software.
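To make the clustering idea concrete, here’s a minimal K-Means sketch on synthetic “pixel” data. Everything here is made up for illustration (the two-band reflectance values and land-cover labels are invented); real workflows operate on multi-band rasters, but the assign-then-recompute loop is the same.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fake two-band reflectance values for three hypothetical land-cover types.
pixels = np.vstack([
    rng.normal(loc=[0.1, 0.8], scale=0.05, size=(50, 2)),  # e.g. vegetation
    rng.normal(loc=[0.6, 0.3], scale=0.05, size=(50, 2)),  # e.g. bare soil
    rng.normal(loc=[0.9, 0.9], scale=0.05, size=(50, 2)),  # e.g. cloud/snow
])

def kmeans(data, k, iters=20):
    """Plain K-Means: alternate nearest-centroid assignment and mean update."""
    # Initialize centroids by sampling k distinct points from the data.
    centroids = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest centroid (Euclidean distance).
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return labels, centroids

labels, centroids = kmeans(pixels, k=3)
print("cluster sizes:", np.bincount(labels, minlength=3))
```

ISODATA works similarly but also splits and merges clusters between iterations, which is why it became a remote-sensing staple when the number of land-cover classes wasn’t known in advance.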