StatQuest with Josh Starmer gives a clear, step-by-step explanation of Naive Bayes.

When most people want to learn about Naive Bayes, they want to learn about the Multinomial Naive Bayes Classifier, which sounds really fancy but is actually quite simple. This video walks you through it one step at a time, and by the end, you'll no longer be naive about Naive Bayes!
  • 0:00 Awesome song and introduction
  • 1:08 Histograms and conditional probabilities
  • 4:22 Classifying “Dear Friend”
  • 7:33 Review of concepts
  • 9:00 Classifying “Lunch Money x 5”
  • 10:54 Pseudocounts
  • 12:35 Why Naive Bayes is Naive
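The chapters above walk through word counts, pseudocounts, and the "naive" independence assumption. Those ideas can be sketched in a few lines of Python from scratch; the training messages below are made up for illustration, not taken from the video.

```python
import math
from collections import Counter

def train(messages_by_class, alpha=1.0):
    """messages_by_class: {label: [token lists]}; alpha is the pseudocount."""
    vocab = {w for msgs in messages_by_class.values() for m in msgs for w in m}
    total_msgs = sum(len(msgs) for msgs in messages_by_class.values())
    model = {}
    for label, msgs in messages_by_class.items():
        counts = Counter(w for m in msgs for w in m)
        total = sum(counts.values())
        # P(word | class) with a pseudocount added to every word in the
        # vocabulary, so a word never seen in this class doesn't zero out
        # the whole score
        word_logprob = {
            w: math.log((counts[w] + alpha) / (total + alpha * len(vocab)))
            for w in vocab
        }
        prior = math.log(len(msgs) / total_msgs)  # P(class) from message counts
        model[label] = (prior, word_logprob)
    return model, vocab

def classify(model, vocab, tokens):
    scores = {}
    for label, (prior, word_logprob) in model.items():
        # The "naive" part: treat word occurrences as independent given the
        # class, so their log-probabilities just add up
        scores[label] = prior + sum(word_logprob[w] for w in tokens if w in vocab)
    return max(scores, key=scores.get)

# Hypothetical training data in the spirit of the video's spam-filter example
data = {
    "normal": [["dear", "friend"], ["lunch", "money"], ["dear", "lunch"]],
    "spam":   [["money", "money"], ["dear", "money"]],
}
model, vocab = train(data)
print(classify(model, vocab, ["dear", "friend"]))  # prints "normal"
```

With these made-up counts, "dear friend" scores higher under the normal class, while a message that is mostly "money" tips toward spam; setting `alpha=0` would reproduce the zero-probability problem the pseudocount chapter addresses.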

Gradient Descent is the workhorse behind much of Machine Learning. When you fit a machine learning method to a training dataset, you’re almost certainly using Gradient Descent.

Gradient Descent can optimize parameters in a wide variety of settings. Since it's so fundamental to Machine Learning, Josh Starmer of StatQuest decided to make a "step-by-step" video that shows exactly how it works.
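As a minimal sketch of the idea, here is gradient descent fitting a line to three points by least squares. The data, starting values, learning rate, and step count are illustrative assumptions, not taken from the video.

```python
# Made-up (x, y) points to fit with y = intercept + slope * x
data = [(0.5, 1.4), (2.3, 1.9), (2.9, 3.2)]

intercept, slope = 0.0, 1.0  # arbitrary starting guesses
learning_rate = 0.01

for step in range(1000):
    # Gradients of the sum of squared residuals w.r.t. each parameter
    d_intercept = sum(-2 * (y - (intercept + slope * x)) for x, y in data)
    d_slope     = sum(-2 * x * (y - (intercept + slope * x)) for x, y in data)
    # Take a step downhill, scaled by the learning rate
    intercept -= learning_rate * d_intercept
    slope     -= learning_rate * d_slope
    # Stop once the steps become tiny
    if max(abs(d_intercept), abs(d_slope)) < 1e-6:
        break

print(round(intercept, 3), round(slope, 3))
```

For these points the closed-form least-squares fit is intercept ≈ 0.949 and slope ≈ 0.641, so the loop should land very close to those values; a learning rate that is too large would instead make the steps diverge.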

Heads up: there is some singing.