Neural Network Backpropagation Example With Activation Function

34,472 views

Mikael Laine

1 day ago

The simplest possible backpropagation example, done with the sigmoid activation function.
Some brief comments on how gradients are calculated in actual implementations.
Edit: there is a slight omission/error in the da/dw expression, as pointed out by Laurie Linnett. The video has da/dw = a(1-a), but it should be i*a(1-a), because the argument of a is the inner function (i*w), whose derivative with respect to w is i.
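
For readers who want to run the numbers themselves, here is a minimal sketch of the single-neuron example in Python, using the corrected derivative i*a*(1-a). The input i, target y, learning rate and step count are made-up illustrative values, not the ones from the video:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative values: input i, initial weight w, target y, learning rate lr
i, w, y, lr = 1.5, 0.8, 0.5, 0.1

for step in range(5):
    a = sigmoid(i * w)          # forward pass: a = sigmoid(i*w)
    C = (a - y) ** 2            # squared-error cost
    dC_da = 2 * (a - y)         # dC/da
    da_dw = i * a * (1 - a)     # corrected derivative: i * a * (1 - a)
    dC_dw = dC_da * da_dw       # chain rule
    w -= lr * dC_dw             # gradient-descent update
    print(f"step {step}: a={a:.4f}, cost={C:.6f}, w={w:.4f}")
```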

Comments: 38
@maxim25o2 4 years ago
There are many people teaching backpropagation, but after watching tons of videos I think not many of them really know how it works. Nobody calculates and shows the numbers in the equations. This is the first tutorial that answers all my questions about backpropagation. Many other people just copy somebody else's work without understanding it. The tutorial is great: step by step, explaining the equations and breaking them down into the simplest understandable form. Great job!
@ss5380 3 months ago
You are a life saver!! Thank you for breaking the whole process down in such an understandable way!!
@laurielinnett8072 4 years ago
I think da/dw should be i*a*(1-a). Let z=i*w, then a=1/(1+exp(-z)) and da/dz=a*(1-a). Then dz/dw=i, so da/dw=(da/dz)*(dz/dw)=i*a*(1-a). Nevertheless, an excellent presentation, Mikael, showing backpropagation and weight updating for a simple example without distracting subscripts and superscripts. Keep up the good work. LML
@mikaellaine9490 4 years ago
Darn, you're correct! I forgot to add the derivative of the inner function w*i, which would indeed be i as a multiplier to a(1-a).
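
A quick numerical cross-check of the corrected expression, as a sketch (the values of i and w are arbitrary): the analytic gradient i*a*(1-a) should match a centered finite difference of the sigmoid.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

i, w, eps = 1.5, 0.8, 1e-6
a = sigmoid(i * w)
analytic = i * a * (1 - a)                                       # i * a * (1 - a)
numeric = (sigmoid(i * (w + eps)) - sigmoid(i * (w - eps))) / (2 * eps)
print(analytic, numeric)   # the two values agree to roughly 1e-10
```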
@yasserahmed2781 3 years ago
I've been repeating the calculations several times on paper, trying to understand how the "i" disappeared. I even thought the video implicitly assumed that i was 1 or something, haha. I should always check the comments right away.
@jackmiller2614 3 years ago
Thanks so much for this video -- I have spent hours looking for a clean explanation of this and I have finally found it!
@redditrewindeverybody-subs9336 4 years ago
Thanks for your videos! I'm finally able to implement backpropagation because I (kinda) understood the Maths behind it thanks to you! Please keep more vids coming!
@Sandium 3 years ago
I was having difficulties wrapping my head around Backpropagation. Thank you very much for this video!
@vincentjr8013 3 years ago
How will the bias be updated in a multilayer network?
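
One possible answer, sketched in Python rather than the video's notation: each layer's bias gets its own gradient, and because z = w*a_prev + b, we have dz/db = 1, so the bias gradient is simply that layer's delta. All numbers below are illustrative.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Two-layer example with scalar activations (illustrative values)
x, y, lr = 1.0, 0.5, 0.1
w1, b1, w2, b2 = 0.4, 0.1, 0.7, -0.2

# Forward pass
a1 = sigmoid(w1 * x + b1)
a2 = sigmoid(w2 * a1 + b2)

# Backward pass: delta = dC/dz for each layer
delta2 = 2 * (a2 - y) * a2 * (1 - a2)
delta1 = delta2 * w2 * a1 * (1 - a1)

# Weight gradients multiply the delta by the incoming activation;
# bias gradients are just the deltas, because dz/db = 1.
w2 -= lr * delta2 * a1
b2 -= lr * delta2
w1 -= lr * delta1 * x
b1 -= lr * delta1
```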
@jiangfenglin4359 3 years ago
Thank you so much for making these videos! I love your explanations. :)
@kyju77 1 year ago
Hi, I'll join the others in thanking you for this video! Amazing explanation. Just one question: your example was done, let's say, with a single "training session". When I have dozens or hundreds of training examples, I calculate the average of the final error. What about da/dw, for example? Should I also calculate the average over all training examples and then apply it? Or is there another approach? Thanks again.
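
For what it's worth, the usual approach is to average the full gradient dC/dw over the batch (not da/dw on its own) and then apply a single update. A sketch with made-up numbers:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Several (input, target) pairs -- illustrative data
batch = [(1.0, 0.2), (1.5, 0.5), (0.5, 0.9)]
w, lr = 0.8, 0.5

for epoch in range(100):
    grad = 0.0
    for i, y in batch:
        a = sigmoid(i * w)
        grad += 2 * (a - y) * i * a * (1 - a)   # dC/dw for this example
    grad /= len(batch)                          # average over the batch
    w -= lr * grad                              # one update per batch
```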
@FPChris 2 years ago
As you go back, when do you update each weight? Do you go back to w1, adjust it, do a new forward pass, then go back only to w2, do a new forward pass, then go back only to w3, and so on?
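
In the standard formulation, one forward/backward pass computes all the gradients using the old weights, and only then are all the weights updated together; no intermediate forward passes are needed. A sketch with illustrative values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative 3-weight chain: x -> w1 -> w2 -> w3 -> output
x, y, lr = 1.0, 0.5, 0.1
w1, w2, w3 = 0.4, 0.7, -0.3

# One forward pass with the current weights
a1 = sigmoid(w1 * x)
a2 = sigmoid(w2 * a1)
a3 = sigmoid(w3 * a2)

# One backward pass computes ALL gradients from those same (old) values
d3 = 2 * (a3 - y) * a3 * (1 - a3)
d2 = d3 * w3 * a2 * (1 - a2)
d1 = d2 * w2 * a1 * (1 - a1)
grad_w3, grad_w2, grad_w1 = d3 * a2, d2 * a1, d1 * x

# Only now are the weights updated, all together; the next forward pass
# then uses the new weights.
w3 -= lr * grad_w3
w2 -= lr * grad_w2
w1 -= lr * grad_w1
```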
@dmdjt 4 years ago
Thank you very much for your effort and excellent explanation!
@sumayyakamal8857 3 years ago
Thank you so much. I often hear that Hadamard multiplication is used, but what is it used for?
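
In case it helps: the Hadamard (elementwise) product shows up once a whole layer is backpropagated as vectors, where the upstream error is multiplied elementwise by the activation derivative. A NumPy sketch with illustrative values:

```python
import numpy as np

a = np.array([0.2, 0.7, 0.9])           # layer activations (sigmoid outputs)
upstream = np.array([0.1, -0.3, 0.05])  # dC/da coming back from the next layer

# Hadamard product: elementwise multiplication, NOT a matrix product
delta = upstream * (a * (1 - a))        # dC/dz for this layer
print(delta)
```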
@nickpelov 1 year ago
Question: there are faster activation functions, but how do they affect backpropagation? When using the sigmoid function, the output also appears in its own derivative. That's not the case for other functions. Is it worth the effort if backpropagation would be a lot slower? Then again, once the network is trained it will be used many times, so I guess you can spend a lot more computing power on learning and then run the network on a device with less computing power. Correct me if I'm wrong.
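
For comparison, here is a sketch (not from the video) of how a ReLU layer's derivative enters backpropagation; its derivative is just 0 or 1, so the backward pass is, if anything, cheaper than with the sigmoid.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def relu_derivative(z):
    return (z > 0).astype(float)        # 1 where z > 0, else 0

z = np.array([-1.0, 0.5, 2.0])          # pre-activations of a layer
upstream = np.array([0.2, -0.1, 0.4])   # dC/da from the next layer
delta = upstream * relu_derivative(z)   # elementwise, just like the sigmoid case
print(delta)
```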
@flavialan4544 3 years ago
You are a real teacher!
@onesun3023 4 years ago
Why do you use lowercase Phi for the activation?
@zamanmakan2729 3 years ago
Sorry, a(1-a) is the derivative of what? I didn't get how we arrived there.
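
For anyone else wondering: a(1-a) is the derivative of the sigmoid with respect to its own argument z. A short worked derivation (generic notation, not the video's):

a(z) = 1/(1 + e^(-z))
da/dz = e^(-z)/(1 + e^(-z))^2 = [1/(1 + e^(-z))] * [e^(-z)/(1 + e^(-z))] = a*(1 - a)

and by the chain rule, with z = i*w, da/dw = (da/dz)*(dz/dw) = i*a*(1 - a).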
@nasirrahim5610 1 year ago
Your explanation is amazing 👏
@raymond5887 4 years ago
Thanks for the awesome explanation! I finally know how to do back prop now haha.
@obsidianhead 3 years ago
Thank you for this excellent video
@justchary 1 year ago
Thank you very much. This was very helpful.
@nickpelov 1 year ago
In the table at 12:37 there is no way to see when you should stop. Maybe you should have included the actual output y, or at least shown y on screen. So the goal is to reach a=0.5, right?
@nickpelov 1 year ago
I don't understand why you would calculate da/dw in advance and not during the backpropagation. Do we use it more than once? For each iteration da/dw has a different value, so I don't see why we should calculate it up front. We can just take the output a and calculate a(1-a) during the backpropagation.
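
For the record, nothing forces you to precompute it; a common pattern is to cache the activation a from the forward pass and evaluate a*(1-a) on the fly in the backward pass. A sketch with made-up values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(i, w):
    a = sigmoid(i * w)
    return a                 # cache just the activation

def backward(i, a, y):
    # a*(1-a) is evaluated here, during backpropagation, from the cached a
    return 2 * (a - y) * i * a * (1 - a)

i, w, y, lr = 1.5, 0.8, 0.5, 0.1
a = forward(i, w)
w -= lr * backward(i, a, y)
```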
@benwan8927 2 years ago
good and clear explanation
@amukh1_dev274 1 year ago
Thank you! You earned a sub ❤🎉
@BB-sd6sm 3 years ago
great video mate
@trevortyne534 1 year ago
Excellent explanation Mikael ! Trev T Sydney
@TheRainHarvester 1 year ago
It seems like looking up the stored numbers would require indirection (following pointers, fetching from slow memory), while just recalculating would take fewer clock cycles.
@TheRainHarvester 1 year ago
Storing probably wins versus recursive recalculation, which would be required for the multiple branches of a wide NN.
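
A sketch of what "storing" typically means in practice: the forward pass caches each layer's activation once, and the backward pass reuses those cached values instead of re-running the forward chain for every weight it visits (illustrative code, not from the video):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative weight matrices for a 2-2-1 network
weights = [np.array([[0.4, -0.2], [0.1, 0.3]]),
           np.array([[0.7, 0.5]])]

def forward(x, weights):
    activations = [x]                    # store every layer's output once
    for W in weights:
        activations.append(sigmoid(W @ activations[-1]))
    return activations

x = np.array([1.0, 0.5])
acts = forward(x, weights)
# The backward pass can now read acts[k] directly instead of recomputing
# the forward chain for each weight it visits.
```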
@kishorb.surwade6722 3 years ago
Nice explanation. One special request: if you could give an illustration in MS Excel, it would aid understanding.
@youssryhamdy4923 2 years ago
The sound of this video is low; please try to make it louder. Thanks.
@edwardmontague2021 1 year ago
Defined as a function in Maxima CAS: sigmoid(x):=1/(1+exp(-x))$. Regarding da/dw = d sigmoid(w*x + b)/dw, where x == a from the previous layer: using Maxima CAS, I obtain (x*%e^(w*x+b))/(2*%e^(w*x+b)+%e^(2*w*x)+%e^(2*b)). Whereas with a = sigmoid(w*x + b) and the derivative defined as a*(1-a), I obtain (%e^(w*x+b))/(2*%e^(w*x+b)+%e^(2*w*x)+%e^(2*b)), which differs by the multiplier x. Which is correct?
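
A quick symbolic cross-check in Python/SymPy (a sketch; the variable names mirror the comment): differentiating sigmoid(w*x + b) with respect to w and comparing it against x*a*(1-a), which is the chain-rule form.

```python
import sympy as sp

w, x, b = sp.symbols('w x b')
a = 1 / (1 + sp.exp(-(w * x + b)))       # a = sigmoid(w*x + b)

lhs = sp.diff(a, w)                      # d sigmoid(w*x + b) / dw
rhs = x * a * (1 - a)                    # chain rule: x * a * (1 - a)

print(sp.simplify(lhs - rhs))            # prints 0, so da/dw = x*a*(1-a)
```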
@andreaardemagni6401 8 months ago
Unfortunately the volume of this video is too low to watch it on a phone. Such a shame :(
4 years ago
Please make videos about neural networks in Python.
@knowledgeanddefense1054 1 year ago
Fun fact, did you know Einstein and Hawking were socialists? Just thought you may find that interesting :)
@vidumini23 3 years ago
Thank you so much for the clear excellent explanation and effort.