PYTORCH COMMON MISTAKES - How To Save Time 🕒

In this video I show you 10 common PyTorch mistakes; avoiding them will save you a lot of time debugging models. This was inspired by a tweet by Andrej Karpathy, which is why I said it was approved by him :)

Andrej Karpathy Tweet:

People often ask which courses are great for getting into ML/DL, and the two I started with are the ML course and the DL Specialization, both by Andrew Ng. Below you'll find both affiliate and non-affiliate links if you want to check them out. The price is the same for you, but a small commission goes back to the channel if you buy through the affiliate link.
ML Course (affiliate):
DL Specialization (affiliate):
ML Course (no affiliate):
DL Specialization (no affiliate):

GitHub Repository:

✅ Equipment I use and recommend:
❤️ Become a Channel Member:
✅ One-Time Donations:
Paypal:
Ethereum: 0xc84008f43d2E0bC01d925CC35915CdE92c2e99dc

▶️ You Can Connect with me on:
Twitter -
LinkedIn -
GitHub -

PyTorch Playlist:

OUTLINE:
0:00 - Introduction
0:21 - 1. Didn't overfit batch
2:45 - 2. Forgot toggle train/eval
4:47 - 3. Forgot .zero_grad()
6:15 - 4. Softmax when using CrossEntropy
8:09 - 5. Bias term with BatchNorm
9:54 - 6. Using view as permute
12:10 - 7. Incorrect Data Augmentation
14:19 - 8. Not Shuffling Data
15:28 - 9. Not Normalizing Data
17:28 - 10. Not Clipping Gradients
18:40 - Which ones did I miss?
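Mistake 1 (didn't overfit a batch) can be sketched like this: before launching a full training run, train on a single batch and check the loss goes to roughly zero. The model shape, learning rate, and step count below are illustrative assumptions, not from the video.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Grab ONE batch and train only on it; if the model cannot drive the
# loss toward zero here, something is broken (wiring, loss, labels),
# and you want to find that out before a long run on the full dataset.
x = torch.randn(16, 10)
y = torch.randint(0, 2, (16,))

first_loss = None
for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    if first_loss is None:
        first_loss = loss.item()
    loss.backward()
    opt.step()

print(first_loss, loss.item())  # final loss should be far below the first
```

A small network can memorize 16 samples easily, so a loss that plateaus high is a strong bug signal.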
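Mistake 2 (forgot to toggle train/eval) comes down to layers like Dropout and BatchNorm behaving differently in the two modes. A minimal sketch, with an illustrative toy model:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Dropout is stochastic in train mode and a no-op in eval mode;
# BatchNorm similarly switches from batch stats to running stats.
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))
x = torch.randn(1, 4)

model.train()          # Dropout randomly zeroes activations
out_train = model(x)

model.eval()           # Dropout passes activations through unchanged
out_eval = model(x)

# Forgetting model.eval() before validation means you evaluate a
# randomly thinned network, so metrics are noisy and pessimistic.
print(torch.equal(out_eval, model(x)))  # True: eval mode is deterministic
```

Remember to call `model.train()` again before resuming training, or Dropout and BatchNorm stay disabled.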
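Mistake 3 (forgot `.zero_grad()`) exists because PyTorch accumulates gradients across `backward()` calls instead of overwriting them. A tiny demonstration on a single tensor:

```python
import torch

# Gradients accumulate across backward() calls; without zero_grad()
# every step mixes in stale gradients from earlier batches.
w = torch.tensor([2.0], requires_grad=True)

loss = (w * 3).sum()
loss.backward()
print(w.grad)        # tensor([3.])  -- d(3w)/dw = 3

loss = (w * 3).sum()
loss.backward()      # accumulates: 3 + 3
print(w.grad)        # tensor([6.])

w.grad.zero_()       # what optimizer.zero_grad() does for each parameter
loss = (w * 3).sum()
loss.backward()
print(w.grad)        # tensor([3.]) again
```

In a real loop the fix is `optimizer.zero_grad()` once per iteration, before (or after) `loss.backward()` and `optimizer.step()`.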
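Mistake 4 (applying softmax before CrossEntropy) comes from the fact that `nn.CrossEntropyLoss` already combines log-softmax and negative log-likelihood, so it expects raw logits. The batch size and class count below are arbitrary:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(8, 5)           # raw network outputs
targets = torch.randint(0, 5, (8,))

ce = nn.CrossEntropyLoss()

# CrossEntropyLoss == LogSoftmax + NLLLoss, so feed it RAW logits:
correct = ce(logits, targets)
same = F.nll_loss(F.log_softmax(logits, dim=1), targets)

# Feeding probabilities applies softmax twice overall, which squashes
# the logits and shrinks gradients, so training slows or stalls.
wrong = ce(F.softmax(logits, dim=1), targets)

print(torch.isclose(correct, same).item())  # True
print(correct.item(), wrong.item())         # the two losses differ
```

If your model ends in `nn.Softmax`, remove it for training with CrossEntropy and apply softmax only when you need probabilities at inference time.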
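Mistake 5 (bias term with BatchNorm) is about redundancy: BatchNorm subtracts the per-channel mean, so any constant bias added by the preceding Conv/Linear layer cancels out. A sketch with illustrative shapes:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm1d(4)
x = torch.randn(8, 4)
bias = torch.randn(4)    # a per-channel bias, as a Linear/Conv layer would add

# BatchNorm's mean subtraction removes any per-channel constant, so the
# bias parameter of the layer before it is wasted memory and compute:
out_without = bn(x)
out_with = bn(x + bias)

print(torch.allclose(out_without, out_with, atol=1e-5))  # True
```

Hence the common pattern `nn.Conv2d(..., bias=False)` (or `nn.Linear(..., bias=False)`) immediately before a BatchNorm layer; BatchNorm's own `beta` parameter plays the bias role instead.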
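Mistake 6 (using `view` as `permute`) is easy to see on a tiny tensor: `view` only re-chops the existing memory order, while `permute` actually swaps axes.

```python
import torch

x = torch.arange(6).reshape(2, 3)
# x = [[0, 1, 2],
#      [3, 4, 5]]

# view/reshape keeps the flat memory order -- it does NOT transpose:
v = x.view(3, 2)      # [[0, 1], [2, 3], [4, 5]]

# permute (or .t() for 2-D) swaps the axes:
p = x.permute(1, 0)   # [[0, 3], [1, 4], [2, 5]]

print(torch.equal(v, p))  # False: view silently scrambled the data
```

Both calls produce a `(3, 2)` tensor, so no error is raised; the bug only shows up as scrambled data downstream, which is what makes this mistake so costly.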
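Mistake 10 (not clipping gradients) can be addressed with `torch.nn.utils.clip_grad_norm_`, called between `backward()` and `step()`. The model and `max_norm=1.0` below are illustrative choices:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)

loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# Rescale all gradients so their global L2 norm is at most max_norm;
# this guards against exploding gradients (common in RNNs):
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

total = torch.norm(torch.stack([p.grad.norm() for p in model.parameters()]))
print(total <= 1.0 + 1e-6)  # True: the global grad norm is now bounded
```

Note the trailing underscore: the function clips the gradients in place and returns the pre-clipping total norm, which is worth logging to see how often clipping actually fires.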