Lex Fridman just uploaded the second part of his interview with Vladimir Vapnik.

Vladimir Vapnik is the co-inventor of support vector machines, support vector clustering, VC theory, and many foundational ideas in statistical learning. He was born in the Soviet Union and worked at the Institute of Control Sciences in Moscow, then moved to the US, where he worked at AT&T, NEC Labs, and Facebook AI Research; he is now a professor at Columbia University. His work has been cited over 200,000 times. This conversation is part of the Artificial Intelligence podcast.

Just as with the internet and the Interstate Highway System, military technology will lead the way in how IoT sensor data is collected and analyzed.

The 21st-century battlefield does not suffer from a shortage of sensors, spread across soldier wearables, vehicles, drones, video cameras, spectrum, signal, and radio sensors, cyber sensors, and scores of other devices that make up the Internet of Battlefield Things.

More sensors mean more data – too much data – which limits the DoD’s ability to turn that information into actionable intelligence in a timely fashion. But AI is poised to change that equation by shifting the burden from human to machine so that only the most relevant and timely data reaches those who need it.

Swift is headed toward a decidedly AI-heavy future.

The core development team behind Apple’s Swift programming language has set priorities including refining the language for use in machine learning.

Ambitions in the machine learning space are part of plans to invest in “user-empowering directions” for the language. Apple is not the only company with machine learning ambitions for Swift; Google has integrated Swift with the TensorFlow machine learning library in a project called Swift for TensorFlow. And the Swift community has created Swift Numerics, a library that can be used for machine learning.
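The article doesn't show any code, but as a minimal sketch of the kind of generic numeric code Swift Numerics enables, here is a logistic sigmoid (a common machine learning building block) written against the library's Real protocol. The sigmoid function itself is an illustrative example, not part of the library:

```swift
import Numerics

// Logistic sigmoid written against Swift Numerics' generic Real
// protocol, so one definition works for Float, Double, and friends.
func sigmoid<T: Real>(_ x: T) -> T {
    1 / (1 + T.exp(-x))
}

print(sigmoid(0.0))         // 0.5
print(sigmoid(Float(2.0)))  // ≈ 0.88
```

Generics like this let a single numeric routine serve both Float-based on-device models and Double-based analysis code.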

Normally, you don’t need to be a fortune teller to predict Oscar winners, but the Academy has sprung a few surprises in recent years.

Recently, a team of data scientists tested whether their machine learning model could outsmart the bookmakers — with mixed results.

The boffins behind the BigML machine learning platform made their predictions with a Deepnets model, an optimised implementation of the Deep Neural Networks supervised learning technique.
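The article doesn't reveal the internals of BigML's Deepnets, but the underlying technique is easy to sketch. Below is a deliberately tiny feedforward network in Swift, stacked dense layers with a ReLU nonlinearity applied at every layer for simplicity; the layer sizes, weights, and inputs are all made up for the example:

```swift
// A toy deep neural network: stacked dense layers with ReLU.
// This illustrates the general technique, NOT BigML's Deepnets
// implementation; all numbers below are invented.
struct DenseLayer {
    var weights: [[Double]]  // one row of input weights per output unit
    var biases: [Double]     // one bias per output unit

    func forward(_ input: [Double]) -> [Double] {
        zip(weights, biases).map { (row, bias) in
            // Weighted sum of the inputs plus bias, then ReLU.
            max(0, zip(row, input).reduce(bias) { $0 + $1.0 * $1.1 })
        }
    }
}

// Three inputs -> two hidden units -> one output.
let hidden = DenseLayer(weights: [[0.5, -0.2, 0.1],
                                  [0.3,  0.8, -0.5]],
                        biases: [0.0, 0.1])
let output = DenseLayer(weights: [[1.0, -1.0]], biases: [0.2])

print(output.forward(hidden.forward([1.0, 2.0, 3.0])))  // ≈ [0.1]
```

A production system would learn the weights from labelled data (supervised learning) rather than hard-coding them, and would tune depth and layer widths, which is the optimisation work a platform like BigML automates.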

Its biggest miss was in the hotly contested best picture category. The model correctly rejected the bookies' favourite, 1917, to pick an outsider, but it ultimately went for the wrong one, plumping for Once Upon a Time in Hollywood ahead of surprise winner Parasite.

One of the great things about the current wave of AI innovation is the large number of open source tools, technologies, and frameworks.

From TensorFlow to Python, Kafka to PyTorch, there's an explosion in the diversity of data science and big data tool sets.

However, when it comes to putting these tools together and building real-world AI applications, regular companies suffer from a serious technology gap compared to technology firms.

Here's an interesting piece from Datanami on how to make AI work in the enterprise.

Many of the latest open source AI technologies are not known for being easy to work with, and typically require highly skilled data scientists to use. This puts a cap on the applicability of the AI tech, and limits its use to companies that have the budget to hire experienced data scientists.