The Complete Mathematics of Neural Networks and Deep Learning
A complete guide to the mathematics behind neural networks and backpropagation.
In this lecture, I aim to explain the mathematics, a combination of linear algebra and optimization, that underlies the most important algorithm in data science today: the feedforward neural network.
Through a plethora of examples, geometric intuitions, and not-too-tedious proofs, I will guide you from understanding how backpropagation works in a single neuron to how it works in entire networks, and why we need backpropagation in the first place.
It’s a long lecture, so I encourage you to break your learning into sessions: get a notebook, take some notes, and see if you can prove the theorems yourself.
As for me: I’m Adam Dhalla, a high school student from Vancouver, BC. I’m interested in how we can use algorithms from computer science to gain intuition about natural systems and environments.
My website:
I write here a lot:
Contact me: adamdhalla@pr