Christina Lee joins Scott Hanselman to show what’s new in Azure Cognitive Services.  Cognitive Services bring AI within reach of every developer—without requiring machine-learning expertise.  All it takes is an API call to embed the ability to see, hear, speak, search, understand, and accelerate decision-making into your apps.


Microsoft Research interviews Mark Hamilton to see how MMLSpark is helping to serve business and the environment.

If someone asked you what snow leopards and Vincent Van Gogh have in common, you might think it was the beginning of a joke. It’s not, but if it were, Mark Hamilton, a software engineer in Microsoft’s Cognitive Services group, budding PhD student and frequent Microsoft Research collaborator, would tell you the punchline is machine learning. More specifically, Microsoft Machine Learning for Apache Spark (MMLSpark for short), a powerful yet elastic open source machine learning library that’s finding its way beyond business and into “AI for Good” applications such as the environment and the arts.

Today, Mark talks about his love of mathematics and his desire to solve big, crazy, core knowledge sized problems; tells us all about MMLSpark and how it’s being used by organizations like the Snow Leopard Trust and the Metropolitan Museum of Art; and reveals how the persuasive advice of a really smart big sister helped launch an exciting career in AI research and development.

Leila Etaati, a Data Soup Summit speaker, will be in the USA for a month, presenting a one-day workshop at the following locations:

Jacob Jedryszek joins Scott Hanselman to talk about using Cognitive Services and Azure Search in your mobile and web apps. Skip hiring search experts who know what an inverted index is.

No distributed-systems expertise is needed to scale your service to handle large amounts of data.

And forget about setting up, owning and managing the infrastructure. Let Azure Search do it all for you!
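To make the "just an API call" point concrete, here is a minimal sketch of querying an Azure Search index over its REST API using only Python's standard library. The service name (`my-search-service`), index name (`hotels`), query key, and API version are placeholder assumptions, not values from the episode.

```python
import json
import urllib.request

# Placeholder values -- substitute your own search service, index, and query key.
SERVICE = "my-search-service"   # assumed service name
INDEX = "hotels"                # assumed index name
API_KEY = "<your-query-key>"

def build_search_request(query_text, top=5):
    """Build a POST request for the Azure Search 'search documents' REST operation."""
    url = (f"https://{SERVICE}.search.windows.net"
           f"/indexes/{INDEX}/docs/search?api-version=2019-05-06")
    body = {
        "search": query_text,  # full-text query; the inverted index is managed for you
        "top": top,            # maximum number of results to return
    }
    headers = {"Content-Type": "application/json", "api-key": API_KEY}
    return urllib.request.Request(url, data=json.dumps(body).encode("utf-8"),
                                  headers=headers, method="POST")

req = build_search_request("beachfront wifi")
print(req.full_url)
# To execute against a real service:
# with urllib.request.urlopen(req) as resp:
#     results = json.load(resp)["value"]
```

The point of the sketch is that the app code stays this small: indexing, scaling, and ranking all happen on the service side.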

In this episode, learn how the Anomaly Detection service comes to your on-premises systems via containers.

By deploying the same API service in containers, close to your data, you can keep data on-premises to comply with regulations, avoid network latency, and reuse the same Anomaly Detector-powered application across both cloud and on-premises environments.
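Because the container exposes the same API as the cloud service, switching between the two is just a matter of which base endpoint the client targets. A small sketch (the endpoint URLs below are placeholders; a local container is assumed to listen on `localhost:5000`):

```python
# Sketch: the same Anomaly Detector request works against the cloud service or a
# local container -- only the base endpoint changes. URLs below are placeholders.
CLOUD_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
CONTAINER_ENDPOINT = "http://localhost:5000"  # assumed container host:port

DETECT_PATH = "/anomalydetector/v1.0/timeseries/entire/detect"

def detect_url(on_premises: bool) -> str:
    """Pick the endpoint; the request body and headers are identical either way."""
    base = CONTAINER_ENDPOINT if on_premises else CLOUD_ENDPOINT
    return base + DETECT_PATH

print(detect_url(on_premises=True))   # data never leaves your network
print(detect_url(on_premises=False))  # same call against the cloud service
```

Keeping the endpoint behind a single configuration switch is what lets one application run unchanged across cloud and on-premises deployments.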

In this episode of the AI Show, get a look at a simple way to detect anomalies that can occur in your data.

Knowing when something goes off the rails is incredibly important, and it is now easily done with a simple API call.
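As a rough sketch of what that API call looks like, the request body for detecting anomalies across an entire time series is just the points plus their granularity. The series below is invented for illustration (a flat hourly signal with one obvious spike):

```python
import json
from datetime import datetime, timedelta

# Build a toy hourly series (values are made up for illustration).
start = datetime(2019, 1, 1)
series = [
    {"timestamp": (start + timedelta(hours=i)).isoformat() + "Z",
     "value": 10.0 if i != 8 else 100.0}  # one obvious spike at hour 8
    for i in range(12)
]

# Request body for the "detect anomalies in the entire series" operation.
body = {
    "series": series,
    "granularity": "hourly",  # spacing of the timestamps
}
print(json.dumps(body)[:80])
# POST this JSON (with your subscription key header) to the detect endpoint;
# the response flags each point, so the spike at hour 8 would stand out.
```

The service inspects the whole series at once, so a single request like this is enough to flag every anomalous point.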


One of the best tools Microsoft currently has in its AI toolkit is the QnA Maker. It uses NLP to mine one or more source documents and expose the contents as a chatbot. It does a great job of answering questions, but a newly added feature (Multi-Turn) mimics a crucial ability that a real human adds: the ability to clarify, ask for more information, or do anything more than a one-off question-response type of conversation.

In this article, Matt Wade examines this feature and how to exploit it to make your QnA bots even smarter.

But that all changed with the recent introduction of QnA Maker multi-turn conversations. With multi-turn, the experience with your QnA KB is much more fluid and significantly more natural. Let’s see how.
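At the API level, a multi-turn follow-up works by sending the previous answer's QnA ID back to the service inside a context object. A minimal sketch of the two request bodies (the question text and the ID `42` are placeholders, not values from a real knowledge base):

```python
import json

# First turn: a plain question against the knowledge base.
first_turn = {"question": "How do I reset my password?", "top": 1}

# Follow-up turn: include a context object so QnA Maker can resolve the
# question relative to the previous answer (the id below is a placeholder
# for the qnaId returned with the first answer).
follow_up = {
    "question": "What if I forgot my username too?",
    "top": 1,
    "context": {
        "previousQnAId": 42,
        "previousUserQuery": first_turn["question"],
    },
}
print(json.dumps(follow_up, indent=2))
# Both bodies are POSTed to the knowledge base's generateAnswer endpoint;
# answers can also carry follow-up prompts to suggest the next turn.
```

That context object is what lets the bot interpret a vague follow-up in light of the conversation so far, rather than treating every question as brand new.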