Greetings from Norway! Your videos are very helpful! Thanks for sharing :)
@azahid9503 4 years ago
Backpropagation is an algorithm that computes the partial derivatives of the loss with respect to a neural network's weights; gradient descent then uses those derivatives to solve the optimization problem. Just to clarify the mix-up.
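A minimal sketch of that division of labor in TensorFlow 2, with a toy model and made-up data (not from the video): backprop supplies the partial derivatives, and the optimizer applies the descent step.

    import tensorflow as tf

    x = tf.random.normal((32, 10))                    # toy inputs (hypothetical data)
    y = tf.random.normal((32, 1))                     # toy targets
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)  # plain gradient descent

    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))        # forward pass and loss
    grads = tape.gradient(loss, model.trainable_variables)    # backpropagation: partial derivatives
    optimizer.apply_gradients(zip(grads, model.trainable_variables))  # gradient descent update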
@DanielWeikert 5 years ago
Thanks, Jeff. Could you dive into TF 2.0?
@wolfisraging 5 years ago
Adamax is my favourite one. Second fav is Adam.
@ronmedina429 5 years ago
Isn't $\nabla_\theta J(\theta_{t-1})$ a more proper notation?
@HeatonResearch 5 years ago
I've seen it written a number of different ways, from t to t+1 to t-1. I don't think it would be correct to drop the -1 and use just t, because then we would be taking gradients at the current iteration (t) with respect to the weights of the current iteration (t), which have not been calculated yet. Essentially, the left side of the equation is the current step, and it is calculated from the right side, t-1, the previous step.
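For reference, a minimal sketch of the update being discussed, assuming plain gradient descent with learning rate $\eta$ (the video's exact symbols may differ):

$$\theta_t = \theta_{t-1} - \eta \, \nabla_\theta J(\theta_{t-1})$$

The gradient on the right is evaluated at $\theta_{t-1}$, the previous step's weights, which is why dropping the -1 would refer to weights that have not been computed yet.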
@rudreshmehta6510 4 years ago
Lower the pitch through the audio controller. Otherwise, your tutorials are worth appreciating, except the material taught in module 4 was not explained clearly enough. Every other video is really nice. Thanks for that, sir. Take it as an unbiased suggestion.
@stackexchange7353 5 years ago
RMSProp freaks out a bit once it reaches the star.