Ridge Regression is a neat little way to ensure you don’t overfit your training data - essentially, you are desensitizing your model to the training data. It can also help you solve equations that would otherwise be unsolvable (for example, when you have more parameters than data points), and if that isn’t bad to the bone, I don’t know what is.
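The "unsolvable equations" idea can be seen directly in the ridge closed-form solution. This is a minimal NumPy sketch (not from the video; the toy data and lambda value are made up for illustration): with more parameters than observations, ordinary least squares has no unique solution because X^T X is singular, but adding the ridge penalty lambda * I makes it invertible.

```python
import numpy as np

# Toy data: 2 observations but 3 parameters, so ordinary least squares
# has no unique solution (X^T X is a singular 3x3 matrix).
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
y = np.array([1.0, 2.0])

lam = 1.0  # the ridge penalty (lambda); illustrative value

# Ridge closed form: w = (X^T X + lambda * I)^(-1) X^T y
# Adding lambda * I makes the matrix invertible, which is why ridge
# can "solve" otherwise unsolvable equations.
p = X.shape[1]
w = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
print(w)
```

Larger values of lambda shrink the coefficients toward zero, which is the same desensitizing effect described above.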
This StatQuest follows up on the StatQuests on:
Bias and Variance
Linear Models Part 1: Linear Regression
Linear Models Part 1.5: Multiple Regression
Linear Models Part 2: t-Tests and ANOVA
Linear Models Part 3: Design Matrices
Cross Validation:
For a complete index of all the StatQuest videos, check out:
If you’d like to support StatQuest, please consider...
Buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF -
Paperback -
Kindle eBook -
Patreon:
...or...
YouTube Membership:
...a cool StatQuest t-shirt or sweatshirt:
...buying one or two of my songs (or go large and get a whole album!)
...or just donating to StatQuest!
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
0:00 Awesome song and introduction
1:25 Ridge Regression main ideas
4:15 Ridge Regression details
10:21 Ridge Regression for discrete variables
13:24 Ridge Regression for Logistic Regression
14:12 Ridge Regression for fancy models
15:34 Ridge Regression when you don’t have much data
19:15 Summary of concepts
Correction:
13:39 I meant to say “Negative Log-Likelihood” instead of “Likelihood”.
#statquest #regularization