This video starts by defining the cost function, which is applied after feedforward to measure the network's error. Then we look at how gradient descent works in detail, and how it uses partial derivatives to arrive at the optimal values for the weights and biases. Next we move on to the backpropagation algorithm: we learn how it works and derive its equations using the chain rule. Finally, we conclude with the general form of the backpropagation algorithm.
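The steps described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the video's actual code: it assumes a small 2-2-1 network with sigmoid activations and a mean-squared-error cost, trained on the AND gate, and shows feedforward, the chain-rule backpropagation of the error, and the gradient descent update in one loop.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy data: the AND gate (assumed example, chosen for easy convergence)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [0], [0], [1]], dtype=float)

# Weights and biases for a 2-2-1 network
W1 = rng.normal(size=(2, 2)); b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1)); b2 = np.zeros((1, 1))
lr = 1.0  # learning rate

for _ in range(5000):
    # Feedforward: compute activations layer by layer
    z1 = X @ W1 + b1; a1 = sigmoid(z1)
    z2 = a1 @ W2 + b2; a2 = sigmoid(z2)

    # Cost: mean squared error between prediction and target
    cost = np.mean((a2 - y) ** 2)

    # Backpropagation: apply the chain rule from the output backwards
    d2 = (a2 - y) * a2 * (1 - a2)       # dC/dz2 (sigmoid' = a*(1-a))
    d1 = (d2 @ W2.T) * a1 * (1 - a1)    # dC/dz1, propagated through W2

    # Gradient descent: step each parameter against its partial derivative
    W2 -= lr * (a1.T @ d2); b2 -= lr * d2.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d1);  b1 -= lr * d1.sum(axis=0, keepdims=True)
```

After training, the cost is close to zero and the network's output approximates the AND truth table.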
This is the second video in my series Neural Networks from Scratch in Python, where I explain the inner workings of a neural network and finally implement it from scratch in Python.
Part 1: Feedforward Explained • Feedforward Explained ...
Part 2: Building a Neural Network from Scratch in Python • Building a Neural Netw...
References:
www.coursera.o...
towardsdatasci...
towardsdatasci...
towardsdatasci...
Having trouble? Need help? Connect with me!
Email: adarsh1021@gmail.com
Twitter: / adarsh_menon_
LinkedIn: / adarsh-me. .
Github : github.com/ada...
#neuralnetworks #deeplearning #backpropagation #python