Microsoft Research has a new podcast out talking about learning algorithms.

Deep learning methods such as supervised learning have been very successful in training machines to make predictions about the world. But because they depend so heavily on large amounts of human-annotated data, they've been difficult to scale. Dr. Phil Bachman, a researcher at MSR Montreal, would like to change that, and he's working to train machines to collect, sort, and label their own data so people don't have to.

Today, Dr. Bachman gives us an overview of the machine learning landscape and tells us why it’s been so difficult to sort through noise and get to useful information. He also talks about his ongoing work on Deep InfoMax, a novel approach to self-supervised learning, and reveals what a conversation about ML classification problems has to do with Harrison Ford’s face.
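Deep InfoMax belongs to the family of self-supervised methods that learn representations by maximizing mutual information between features derived from the same input, typically via a contrastive objective. As a rough illustration of that general idea (not the paper's exact loss; the function name and feature shapes below are hypothetical), here is a minimal InfoNCE-style sketch in PyTorch:

```python
# Minimal InfoNCE-style contrastive objective, in the spirit of self-supervised
# methods such as Deep InfoMax. This is NOT the paper's exact loss; the names
# and shapes are illustrative placeholders.
import torch
import torch.nn.functional as F

def infonce_loss(global_feats, local_feats):
    """global_feats: (B, D) summary vectors; local_feats: (B, D) patch vectors
    from the same images. Matching (global, local) pairs are positives; all
    cross-image pairs within the batch serve as negatives."""
    logits = global_feats @ local_feats.t()   # (B, B) similarity matrix
    targets = torch.arange(logits.size(0))    # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage with random features standing in for encoder outputs.
g = F.normalize(torch.randn(8, 64), dim=1)
l = F.normalize(torch.randn(8, 64), dim=1)
print(infonce_loss(g, l))
```

Minimizing this loss pushes features from the same image together and features from different images apart, which is the mechanism that lets the model learn useful structure without human labels.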

Microsoft Research has posted this interesting video:

For an Artificial Intelligence (AI) system to understand the world around us, it needs to be able to interpret and reason about what we see and the language we speak. In recent years, research at the intersection of vision, temporal reasoning, and language has attracted a lot of attention.

One of the major challenges is how to ensure proper grounding and perform reasoning across multiple modalities, given the heterogeneity of the data and the fact that supervision is weak or entirely absent.

Talk slides: https://www.microsoft.com/en-us/research/uploads/prod/2019/11/Towards-Grounded-Spatio-Temporal-Reasoning-SLIDES.pdf

Deep learning has had enormous success on perceptual tasks but still struggles to provide a model for inference. Here's an interesting talk about making neural networks that can reason.

To address this gap, we have been developing networks that support memory, attention, composition, and reasoning. Our MACnet and NSM designs provide a strong prior for explicitly iterative reasoning, enabling them to learn explainable, structured reasoning, as well as achieve good generalization from a modest amount of data. The Neural State Machine (NSM) design also emphasizes the use of a more symbolic form of internal computation, represented as attention over symbols, which have distributed representations. Such designs impose structural priors on the operation of networks and encourage certain kinds of modularity and generalization. We demonstrate the models’ strength, robustness, and data efficiency on the CLEVR dataset for visual reasoning (Johnson et al. 2016), VQA-CP, which emphasizes disentanglement (Agrawal et al. 2018), and our own GQA (Hudson and Manning 2019). Joint work with Drew Hudson.
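To make "attention over symbols, which have distributed representations" concrete, here is a minimal sketch of a single soft-attention step over a set of symbol embeddings. The names and dimensions are hypothetical placeholders; this is not the published MACnet or NSM architecture:

```python
# A single soft-attention step over symbol embeddings, loosely in the spirit
# of the Neural State Machine. All names and sizes here are hypothetical.
import torch
import torch.nn.functional as F

def attend_over_symbols(query, symbol_embeddings):
    """query: (D,) current reasoning state; symbol_embeddings: (N, D), one
    vector per discrete symbol (e.g., object or attribute concepts).
    Returns a distribution over symbols and the attended summary vector."""
    scores = symbol_embeddings @ query     # (N,) similarity scores
    probs = F.softmax(scores, dim=0)       # soft, "symbolic" attention state
    summary = probs @ symbol_embeddings    # (D,) expected symbol embedding
    return probs, summary

symbols = torch.randn(10, 32)  # 10 hypothetical concept embeddings
state = torch.randn(32)
p, s = attend_over_symbols(state, symbols)
print(p.sum(), s.shape)        # probs sum to 1; summary is (32,)
```

The appeal of this design is that the network's internal state is a distribution over a discrete vocabulary of concepts rather than an opaque vector, which is what makes the iterative reasoning more structured and explainable.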

In this video Siraj Raval announces that School of AI is now accepting applications for research fellows in 2019.

From the video description:

We’ll select 10 Fellows and give them 1K USD in Google Cloud credits each, a personal advisor, and help them submit their work to relevant academic outlets like NIPS and popular journals. The deadline for submissions is May 15 2019 and I look forward to your applications! Our 10 Fellows from 2018 did some amazing work, I’ll explain what they did and give guidelines as to what we’re looking for this round. Enjoy!

Application form: https://forms.gle/dJmnNkKPvjzWWJ9L9

Here’s an interesting news article from MIT about research that could revolutionize NLP and advance natural language understanding (NLU).

Children learn language by observing their environment, listening to the people around them, and connecting the dots between what they see and hear. Among other things, this helps children establish their language’s word order, such as where subjects and verbs fall in a sentence. In computing, learning language is […]

With data storage demands increasing every day, conventional storage will not be enough in the future. Enter DNA-based storage: with its ability to store information at the molecular level, it could revolutionize data storage in the age beyond big data. And researchers have recently come one step closer to making this technology a reality.

Researchers at Microsoft and the University of Washington’s Paul G. Allen School of Computer Science & Engineering, named for the late Microsoft co-founder, have built a system of liquids, tubes, syringes, and electronics around a benchtop to deliver the world’s first automated DNA storage device.
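To make the storage idea concrete, here is a toy sketch of the most basic encoding behind DNA data storage: two bits per nucleotide. Real systems, including Microsoft's, add error correction, address indexing, and synthesis constraints that this sketch omits entirely:

```python
# Toy illustration of DNA data storage: digital bits mapped onto the four
# nucleotides (2 bits per base). This is NOT Microsoft's actual encoding.
BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {b: n for n, b in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Pack each byte into four nucleotides, most significant bits first."""
    return "".join(
        BASE_FOR_BITS[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def decode(strand: str) -> bytes:
    """Invert encode(): every four bases reconstruct one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"hello")
assert decode(strand) == b"hello"
print(strand)  # "CGGA..." — four bases per byte
```

Even this naive scheme packs a byte into four bases, hinting at why molecular media are so dense; the hard engineering problems are the automated synthesis and sequencing steps the Microsoft/UW device addresses.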