Deep learning can be a complex and daunting field for newcomers.

Concepts like hidden layers, convolutional neural networks, and backpropagation keep coming up as you try to grasp deep learning topics, and many people are put off by the math alone.

Despite what you have been led to believe, you don’t need an advanced degree or a Ph.D. to learn and master deep learning.

There are certain key concepts you should know (and be well versed in) before you plunge too far into the deep learning world.

The five essentials for starting your deep learning journey are:

  1. Getting your system ready
  2. Python programming
  3. Linear Algebra and Calculus
  4. Probability and Statistics
  5. Key Machine Learning Concepts

Lex Fridman interviews John Hopfield, a professor at Princeton whose life’s work has woven beautifully through biology, chemistry, neuroscience, and physics.

Most crucially, he saw the messy world of biology through the piercing eyes of a physicist. He is perhaps best known for his work on associative neural networks, now known as Hopfield networks, which were one of the early ideas that catalyzed the development of the modern field of deep learning.
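If you haven’t met Hopfield networks before, a minimal sketch may help make the “associative memory” idea concrete: the network stores ±1 patterns with a Hebbian rule and recovers a stored pattern from a noisy cue by repeatedly updating units. This is an illustrative NumPy sketch with made-up random patterns, not code from the episode.

```python
import numpy as np

# Minimal binary Hopfield network: store patterns with a Hebbian rule,
# then recover a stored pattern from a corrupted cue (associative memory).

def train(patterns):
    """Hebbian outer-product rule; patterns are rows of ±1 values."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)          # no self-connections
    return w / patterns.shape[0]

def recall(w, state, steps=10):
    """Asynchronous updates until the state settles into an attractor."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stored = rng.choice([-1, 1], size=(2, 64))   # two random ±1 patterns (toy data)
    w = train(stored)
    noisy = stored[0].copy()
    noisy[:10] *= -1                             # flip 10 bits as "noise"
    print(np.array_equal(recall(w, noisy), stored[0]))  # usually True
```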

Timeline:

  • 0:00 – Introduction
  • 2:35 – Difference between biological and artificial neural networks
  • 8:49 – Adaptation
  • 13:45 – Physics view of the mind
  • 23:03 – Hopfield networks and associative memory
  • 35:22 – Boltzmann machines
  • 37:29 – Learning
  • 39:53 – Consciousness
  • 48:45 – Attractor networks and dynamical systems
  • 53:14 – How do we build intelligent systems?
  • 57:11 – Deep thinking as the way to arrive at breakthroughs
  • 59:12 – Brain-computer interfaces
  • 1:06:10 – Mortality
  • 1:08:12 – Meaning of life

Machine Learning with Phil ponders the question: “Is it better to specialize or generalize in artificial intelligence and deep learning?”

The answer depends on your career aspirations. Do you want to be a deep learning research professor?

Do you want to go to work for Google, Facebook, or other global mega corporations?

Or do you want to found your own unicorn startup?

Each path has its own specialization requirements, which Phil breaks down in this video.

Lex Fridman lands an interview with the one and only Andrew Ng.

Andrew Ng is one of the most impactful educators, researchers, innovators, and leaders in artificial intelligence and in the technology space in general. He co-founded Coursera and Google Brain, launched deeplearning.ai, Landing.ai, and the AI Fund, and was the Chief Scientist at Baidu. As a Stanford professor, and through Coursera and deeplearning.ai, he has helped educate and inspire millions of students, including me. This conversation is part of the Artificial Intelligence podcast.

OUTLINE:

  • 0:00 – Introduction
  • 2:23 – First few steps in AI
  • 5:05 – Early days of online education
  • 16:07 – Teaching on a whiteboard
  • 17:46 – Pieter Abbeel and early research at Stanford
  • 23:17 – Early days of deep learning
  • 32:55 – Quick preview: deeplearning.ai, landing.ai, and AI fund
  • 33:23 – deeplearning.ai: how to get started in deep learning
  • 45:55 – Unsupervised learning
  • 49:40 – deeplearning.ai (continued)
  • 56:12 – Career in deep learning
  • 58:56 – Should you get a PhD?
  • 1:03:28 – AI fund – building startups
  • 1:11:14 – Landing.ai – growing AI efforts in established companies
  • 1:20:44 – Artificial general intelligence

Lex Fridman just uploaded the second part of his interview with Vladimir Vapnik.

Vladimir Vapnik is the co-inventor of support vector machines, support vector clustering, VC theory, and many foundational ideas in statistical learning. He was born in the Soviet Union and worked at the Institute of Control Sciences in Moscow, then moved to the US, where he worked at AT&T, NEC Labs, and Facebook AI Research; he is now a professor at Columbia University. His work has been cited over 200,000 times. This conversation is part of the Artificial Intelligence podcast.
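For readers new to Vapnik’s best-known contribution, here is a tiny, hedged illustration of training a support vector machine with scikit-learn on a toy dataset. The library, kernel choice, and dataset are assumptions made purely for the example; nothing here comes from the interview itself.

```python
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy example: fit a support vector machine classifier on the classic iris dataset.
X, y = datasets.load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=1.0)   # RBF kernel; C controls how soft the margin is
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")
```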