Backpropagation: how it works

150,353 views

Victor Lavrenko

Comments: 17
@alexanderkorsunsky2792
@alexanderkorsunsky2792 8 years ago
Thanks for the explanation! It's the clearest explanation I have found so far.
@fukeya
@fukeya 8 years ago
Really liked your short videos. Thanks!
@dimitrab6485
@dimitrab6485 7 years ago
Amazing video... simple and yet not oversimplified. Thank you very much for uploading.
@minecraftermad
@minecraftermad 7 years ago
What exactly does a node do with, for example, 4 inputs? Does it add them together, divide by 4, and then multiply by a number? What do the weights do? Are they something you apply to the input numbers, or do they tell the node something?
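A minimal sketch for the question above, not taken from the video: in a standard feed-forward unit the node does not divide by the number of inputs. Each input is multiplied by its own weight, the weighted inputs are summed together with a bias term, and the total is squashed by the activation function (a sigmoid in this lecture). The inputs, weights, and bias below are made-up numbers.

```python
import math

def sigmoid(z):
    # Squash any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def node_output(inputs, weights, bias):
    # Each input is scaled by its own weight; the weighted inputs are
    # summed together with the bias, and the total goes through the
    # sigmoid activation -- there is no division by the number of inputs.
    z = bias + sum(w * x for w, x in zip(weights, inputs))
    return sigmoid(z)

# A node with 4 inputs and 4 made-up weights:
print(node_output([0.5, 1.0, -0.2, 0.8], [0.1, -0.4, 0.7, 0.3], bias=0.05))
```

The weights are the learned parameters: they decide how strongly (and with what sign) each input influences the node's output, and backpropagation is the procedure for adjusting them.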
@hosseinpourghaemi4600
@hosseinpourghaemi4600 7 years ago
Undoubtedly all of your videos are excellent and very comprehensible. It would be very nice if you could add some information about the topics that you cover in each video, so we can find the right video easily. Thank you.
@linkolinkampret7840
@linkolinkampret7840 7 years ago
Hossein Pourghaemi O
@tommyunreal
@tommyunreal 8 years ago
Thank you for the video, it was very easy to understand!
@arahflorgallardo871
@arahflorgallardo871 7 years ago
Good day Sir. Can you please clarify your interpretation of the sigmoid function, 4:00 - 4:20? I can't hear it quite well. This vid helped me out too, thank you.
@sen15recess
@sen15recess 8 years ago
Thank you so much!
@MrAgreeandDisagree
@MrAgreeandDisagree 7 years ago
At 5:15 Victor says "f.k times u.jk is just one component of the sum that feeds into g.j, so I just need to differentiate". Could anyone explain that comment about differentiating? Why does that help us?
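One possible reading of that step (this is a reconstruction of the notation, assuming g.j = sigma of the sum over k of u.jk * f.k, not a quote from the slides): the derivative of a sum is the sum of the derivatives of its terms, and the weight u.jk appears in exactly one of those terms, so differentiating the whole input to g.j with respect to u.jk collapses to the single factor f.k:

$$
\frac{\partial}{\partial u_{jk}} \sum_{k'} u_{jk'} f_{k'} = f_k ,
\qquad\text{so}\qquad
\frac{\partial E}{\partial u_{jk}} = \frac{\partial E}{\partial g_j}\,\sigma'\!\Big(\sum_{k'} u_{jk'} f_{k'}\Big)\, f_k .
$$

That is why isolating one component of the sum helps: it turns the derivative of a large expression into a product of a few simple factors.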
@abdul11235
@abdul11235 7 years ago
How are the weights of the edges computed, and how do we know them beforehand?
@minecraftermad
@minecraftermad 7 years ago
They are randomized until you get lucky and the program works how you want.
@kaleeswaranm2679
@kaleeswaranm2679 6 years ago
Randomly initialized. Check gradient descent for more info.
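A minimal Python sketch of what those two replies describe (a generic illustration, not code from the lecture): the weights are not known beforehand. They start as small random numbers and are then repeatedly nudged in the direction that reduces the error, using the gradients that backpropagation computes. Here a single sigmoid unit is trained on a tiny made-up dataset.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny made-up dataset: two inputs -> one target output (logical OR).
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 1.0)]

# The weights are NOT known beforehand: start from small random values.
weights = [random.uniform(-0.5, 0.5) for _ in range(2)]
bias = random.uniform(-0.5, 0.5)
learning_rate = 0.5

for epoch in range(2000):
    for x, target in data:
        y = sigmoid(bias + sum(w * xi for w, xi in zip(weights, x)))
        # Gradient of the squared error 0.5*(y - target)^2 with respect to
        # the pre-activation, via the chain rule: (y - target) * sigmoid'(z),
        # where sigmoid'(z) = y * (1 - y).
        delta = (y - target) * y * (1.0 - y)
        # Gradient descent: move each weight against its gradient.
        weights = [w - learning_rate * delta * xi for w, xi in zip(weights, x)]
        bias -= learning_rate * delta

print("learned weights:", weights, "bias:", bias)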
@RigatoniModular
@RigatoniModular 8 years ago
Is dE/dh_i outside or inside of the summation?
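For what it's worth (I can't speak for the exact indexing on the slide, so treat this as the generic form): in the standard backpropagation recursion the sum runs over the units j that h_i feeds into, dE/dh_i is the value of the whole sum, and the dE/dg_j factors are the ones that sit inside it:

$$
\frac{\partial E}{\partial h_i} = \sum_j \frac{\partial E}{\partial g_j}\,\frac{\partial g_j}{\partial h_i} .
$$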
@ahabonatv9365
@ahabonatv9365 7 years ago
ok great
@NisseOhlsen
@NisseOhlsen 7 years ago
Thank you for the video. I'm sorry, I do not follow. If g = sigma(mu + sum_k(mu_k * h_k)), then dg/d(h_k) = sigma'(h_k) * mu_k, which means that dg = sigma'(h_k) * mu_k * dh_k, so dE/dg = (just inserting) dE/(sigma'(h_k) * mu_k * dh_k) = 1/(sigma'(h_k) * mu_k) * dE/d(h_k), so NOT equal to sigma'(h_k) * mu_k * dE/d(h_k)??
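One possible source of the confusion (my reading, not an official answer): the factor comes from the chain rule applied in the other direction. dE/dg is not obtained by dividing dE by the expression for dg; instead, dE/dh_k is obtained by multiplying dE/dg by dg/dh_k, with sigma' evaluated at the whole pre-activation rather than at h_k alone:

$$
\frac{\partial E}{\partial h_k}
= \frac{\partial E}{\partial g}\,\frac{\partial g}{\partial h_k}
= \frac{\partial E}{\partial g}\;\sigma'\!\Big(\mu + \sum_{k'} \mu_{k'} h_{k'}\Big)\,\mu_k .
$$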
@minecraftermad
@minecraftermad 7 years ago
Too bad I haven't gotten to that stage of mathematics in school yet.