Backpropagation in CNN - Part 1

49,649 views

Coding Lane


Backpropagation in CNN is one of the most difficult concepts to understand, and I have seen very few people actually producing content on this topic.
So in this video, we will understand backpropagation in CNN properly. This is part 1 of the tutorial, and here we will look only at backpropagation for the convolution operation. In part 2, we will see how the gradients propagate backward through the entire architecture.
All deep learning frameworks implement backpropagation for CNNs automatically. But since we are curious, we want to know how it works instead of leaving it entirely to the framework.
So buckle up! And let's understand Backpropagation in CNN.
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Timestamp:
0:00 Intro
1:49 What to obtain
4:22 dL/dK
11:46 dL/dB
13:20 dL/dX
18:51 End
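For anyone who wants to see the same results in code, here is a minimal NumPy sketch (my own, not taken from the video) of the three gradients listed above, assuming a single-channel input, one 2x2 kernel, a scalar bias, stride 1, and no padding:
```python
import numpy as np

def corr2d(A, B):
    """'Valid' cross-correlation of A with B (no kernel flipping), stride 1."""
    h = A.shape[0] - B.shape[0] + 1
    w = A.shape[1] - B.shape[1] + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(A[i:i + B.shape[0], j:j + B.shape[1]] * B)
    return out

# Forward pass: Z = corr2d(X, K) + B   (single channel, stride 1, no padding)
X = np.random.randn(4, 4)      # input
K = np.random.randn(2, 2)      # kernel
B = 0.5                        # scalar bias
Z = corr2d(X, K) + B           # output, shape (3, 3)

# dL/dZ is whatever gradient flows back from the next layer / the loss
dL_dZ = np.random.randn(*Z.shape)

# dL/dK: 'valid' cross-correlation of the input with dL/dZ
dL_dK = corr2d(X, dL_dZ)       # shape (2, 2), same as K

# dL/dB: the bias is added to every output element, so its gradient is a sum
dL_dB = np.sum(dL_dZ)

# dL/dX: 'full' cross-correlation of dL/dZ with the 180°-rotated kernel
pad = K.shape[0] - 1
dL_dX = corr2d(np.pad(dL_dZ, pad), np.rot90(K, 2))   # shape (4, 4), same as X
```
The pattern to notice: dL/dK is itself a 'valid' cross-correlation of the input with dL/dZ, dL/dB is just the sum of dL/dZ, and dL/dX is a 'full' cross-correlation of dL/dZ with the rotated kernel, which is the standard result for a stride-1 convolution.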
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
📕 PDF notes for this video: bit.ly/BackPropCNNP1
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Follow my entire playlist on Convolutional Neural Network (CNN) :
📕 CNN Playlist: • What is CNN in deep le...
At the end of some videos, you will also find quizzes 📑 that can help you to understand the concept and retain your learning.
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
✔ Complete Neural Network Playlist: • How Neural Networks wo...
✔ Complete Logistic Regression Playlist: • Logistic Regression Ma...
✔ Complete Linear Regression Playlist: • What is Linear Regress...
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
If you want to ride on the Lane of Machine Learning, then Subscribe ▶ to my channel here: / @codinglane

Comments: 134
@PranjS
@PranjS 8 months ago
You will be remembered by many future developers as the person who helped them clear their concepts and probably also the person who helped them crack their job in this field. A big thanks for your work!
@CodingLane
@CodingLane 8 months ago
Wow… so glad to hear this Pranjal. Thank you so much for your words 😇
@waleedrafi7977
@waleedrafi7977 2 years ago
Even though I have already watched your videos, because of your hard work and the time you spend on research, I also watch the whole video from my other accounts just to support you and increase your watch time, so you don't stop making videos. I highly encourage you to continue your hard work; one day you will have a big audience.
@CodingLane
@CodingLane 2 years ago
I don't even know how to react to this great comment 😄. Probably the best comment I have seen. Thank you so much. It really means a lot to me. I will keep on making such videos!
@daniteka
@daniteka 6 months ago
I wish I could show my appreciation by liking your videos multiple times, but unfortunately, the system limits us to a single like per video. Thank you so much for sharing your knowledge and expertise!
@muskanmahajan04
@muskanmahajan04 2 years ago
Prolly the best explanation on CNN backprop there is on YT, thank you Jay!
@CodingLane
@CodingLane 2 years ago
You're welcome. Glad to help!
@user-pg4ui4us1u
@user-pg4ui4us1u 8 months ago
An absolutely perfect playlist that provided a wealth of insights - a huge thanks!
@nadiaaouadi4266
@nadiaaouadi4266 A year ago
Thanks for the whole series. I hope you continue uploading videos in the future because they are amazing! *I never comment on videos, but this was totally worth it*
@devvratsahai2068
@devvratsahai2068 A year ago
you're a life saver man!!
@arashroshanpoor1682
@arashroshanpoor1682 2 months ago
This is the best explanation I was looking for. Thanks!
@avisinh7249
@avisinh7249 A year ago
Best example and explanation I have seen. Thanks!
@jordiwang
@jordiwang A year ago
Again bro, it's quite clear and I am loving it. Appreciate it!
@mariap.9768
@mariap.9768 A year ago
Excellent work, I like how you get to the point quickly.
@locostineverything4810
@locostineverything4810 6 months ago
Thanks for all the content so far, appreciate it!
@shrotkumarsrivastava2441
@shrotkumarsrivastava2441 A year ago
This is awesome and rare. Really appreciate the work you have put in here. Thanks for it. Keep rocking.
@bevel1702
@bevel1702 2 years ago
Very down to the point and descriptive, didn't think I'd be able to understand this but the way you described it made it crystal clear. Good work!
@CodingLane
@CodingLane 2 years ago
Thank you 😇
@WilliamJung98
@WilliamJung98 A year ago
This is incredibly helpful. Thank you so much.
@jaiminjariwala5
@jaiminjariwala5 A year ago
Best Explanation Ever! Thank You so much Brother!
@user-ob5ts8yc7v
@user-ob5ts8yc7v A year ago
I'm a Korean subscriber; your content is very helpful for me. Thank you!
@Pierredefermatteee
@Pierredefermatteee 2 years ago
Thank you for the excellent content that you provided for no cost. Please continue making such precious videos, I know you will rock it.
@CodingLane
@CodingLane 2 years ago
Thank you for your kind words!
@MrAMerang
@MrAMerang 3 months ago
BEST YouTube video on this topic, thank you very much.
@Maciek17PL
@Maciek17PL A year ago
Awesome video, crystal clear explanation!!!
@nurabba3273
@nurabba3273 8 months ago
Hello brother. I am not the first, I know, to be enlightened by your educational videos. But I will say this: you have saved my days by giving me so much insight into machine learning. Thank you.
@emanuel8418
@emanuel8418 2 years ago
I just can't describe how much you helped me with this video. Thank you so much.
@CodingLane
@CodingLane 2 years ago
Glad to help 🤗!
@jackrozmaryn7905
@jackrozmaryn7905 2 months ago
Jay, Excellent, Excellent, Excellent and excellent thank you!
@josephj1643
@josephj1643 A year ago
Hey, thank you for making this video. Most people on YouTube have explained CNN, but have not explained how backprop works in CNN, saying it is similar to DNN. I had been looking for this video for a week; it really helped. Do continue making such videos!!!
@CodingLane
@CodingLane A year ago
Glad it was helpful! Sure, will keep uploading more videos. 🙂
@kumkumupreti
@kumkumupreti 6 months ago
Thank you so much...u really make every concept crystal clear ❤
@thienlu7011
@thienlu7011 10 months ago
Thanks a lot! Very intuitive explanation.
@naseeruddin832
@naseeruddin832 2 years ago
At last, found one with the real backprop of CNN. Thanks buddy!
@CodingLane
@CodingLane 2 years ago
You're welcome! Glad it helped.
@lz-ym5eq
@lz-ym5eq 6 months ago
Thank you for the explanation. It helped me a lot.
@shahnewazchowdhury4175
@shahnewazchowdhury4175 A year ago
This is a fantastic video. Keep up the great work! It's surprising this does not have several hundred thousand views already.
@CodingLane
@CodingLane A year ago
Thanks, will do! 🙂
@AnkurKumar-dc3db
@AnkurKumar-dc3db A year ago
Finally after watching this video I now fully understand the backpropagation through CNNs. Thanks man for creating this video.
@CodingLane
@CodingLane A year ago
Happy to help! 🙂
@farrugiamarc0
@farrugiamarc0 3 months ago
Very well explained despite this being a very complex topic and very challenging to teach. Well done!
@alexsemchenkov5740
@alexsemchenkov5740 A year ago
The best explanation on the internet! Thanks!
@CodingLane
@CodingLane A year ago
Thank you. Much appreciate your words!! 🙂😄
@tnmyk_
@tnmyk_ 2 years ago
Amazing lecture! Very well explained! Keep up the good work, man!
@CodingLane
@CodingLane 2 years ago
Thank you
@BlackmetalSM
@BlackmetalSM A year ago
Even ChatGPT was not as clear as you. Great job!
@user-oq7ju6vp7j
@user-oq7ju6vp7j 7 months ago
Thank you for your videos! They are very helpful for those of us who are making NNs from scratch.
@anshulsaini5401
@anshulsaini5401 2 years ago
Man, this is prolly the best video I have seen so far. Literally, thanks for this video; I have been struggling with backpropagation for 2 weeks and today I feel like I know everything lmao. Hats off to you brother, keep grinding, keep rising. Definitely sharing this playlist (especially this video) in my circle!!
@CodingLane
@CodingLane 2 years ago
Thank you so much for your comment. I feel very good seeing this. Glad to see that it helped you. Comments like these keep me going and make me create more such content 😇.
@vandanvirani167
@vandanvirani167 2 years ago
Dude, really great. I was searching all over the internet to find a perfect explanation of CNN backpropagation, but no one explains it like you, not even the best teachers on YouTube.
@CodingLane
@CodingLane 2 years ago
Thank you. Glad I could help 😇
@adelzier4264
@adelzier4264 10 months ago
You are the best, bro! Wishing you all the best.
@CodingLane
@CodingLane 10 months ago
Hey… thank you!
@SharathsooryaBCS
@SharathsooryaBCS A year ago
Sir, really wonderful explanation. I got the concept on the very first attempt. Thank you so much, sir.
@ce108meetsaraiya4
@ce108meetsaraiya4 3 months ago
This is the best explanation of backpropagation in CNN.
@CodingLane
@CodingLane 3 months ago
Thank you!
@mukandrathee
@mukandrathee 2 months ago
U made it so easy. Thanks
@CodingLane
@CodingLane 2 months ago
Glad to help!
@Aca99100
@Aca99100 2 years ago
Hey Coding Lane! This tutorial was a lifesaver for me and just what I was looking for. As you said, there aren't many sources on the internet that explain backprop in CNNs, especially in this depth. Thank you for this video; you got yourself likes on both parts and a new sub! Keep doing what you're doing!
@CodingLane
@CodingLane 2 years ago
Happy to help! And thank you for the comment 😇
@learningfoundation6601
@learningfoundation6601 A year ago
Amazing content keep uploading
@sushantregmi2126
@sushantregmi2126 A year ago
your content is unreal, thank you very much...
@CodingLane
@CodingLane A year ago
Thank you so much. Glad you find it helpful! 😀
@user-do3mw8yx4y
@user-do3mw8yx4y A year ago
Thank you so much!!
@jameshaochi0824
@jameshaochi0824 A year ago
Thank you so much.
@getisbhai
@getisbhai A year ago
amazing video bhai
@RH-mk3rp
@RH-mk3rp A year ago
You are an AMAZING PERSON, THANK YOU.
@CodingLane
@CodingLane A year ago
You are so welcome! Thank you! 🙂
@RH-mk3rp
@RH-mk3rp A year ago
@@CodingLane I retract my statement. Your convolution backpropagation assumes batch_size=1 and channels=1, perhaps to simplify the problem, but in practice this is never the case. If you explain all that in another video then I apologize, but so far I have found none.
@CodingLane
@CodingLane A year ago
@@RH-mk3rp Hi, it would have been difficult to understand the backpropagation if batch_size were taken greater than 1. The purpose of this video was to help viewers understand how backpropagation works. But I hope you find and understand the solution where batch_size > 1 and channels > 1 😇
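For anyone looking for that generalization, here is a loop-based sketch (my own, not from the video) of the same three gradients with a batch dimension and multiple input/output channels, assuming an NCHW layout, stride 1, and no padding:
```python
import numpy as np

def conv_backward(dL_dZ, X, K):
    """Backward pass of a stride-1, 'valid' convolution layer.
    Assumed shapes: X (N, C_in, H, W), K (C_out, C_in, kh, kw),
    dL_dZ (N, C_out, H-kh+1, W-kw+1)."""
    N, C_in, H, W = X.shape
    C_out, _, kh, kw = K.shape
    Ho, Wo = H - kh + 1, W - kw + 1

    dL_dK = np.zeros_like(K)
    dL_dB = dL_dZ.sum(axis=(0, 2, 3))      # one bias per output channel
    dL_dX = np.zeros_like(X)

    for i in range(Ho):
        for j in range(Wo):
            patch = X[:, :, i:i + kh, j:j + kw]      # (N, C_in, kh, kw)
            grad = dL_dZ[:, :, i, j]                 # (N, C_out)
            # kernel gradient accumulates over the batch and all positions
            dL_dK += np.einsum('nf,ncij->fcij', grad, patch)
            # input gradient routes each output gradient back to its patch
            dL_dX[:, :, i:i + kh, j:j + kw] += np.einsum('nf,fcij->ncij', grad, K)
    return dL_dK, dL_dB, dL_dX

# Example shapes: batch of 8, 3 input channels, 16 filters of size 3x3
X = np.random.randn(8, 3, 28, 28)
K = np.random.randn(16, 3, 3, 3)
dL_dZ = np.random.randn(8, 16, 26, 26)
dL_dK, dL_dB, dL_dX = conv_backward(dL_dZ, X, K)
```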
@enricollen
@enricollen A year ago
thanks sir, appreciate it
@omerihtizaz9043
@omerihtizaz9043 2 years ago
Great video, Keep up the good work!
@CodingLane
@CodingLane 2 years ago
Thank you!
@ummayhaney4162
@ummayhaney4162 A month ago
Thank you❤💙
@shreedevisindagi888
@shreedevisindagi888 5 months ago
Thank you.
@VR-fh4im
@VR-fh4im 11 months ago
Brilliant.
@cobblin_gock
@cobblin_gock A year ago
awesome work. well explained.
@CodingLane
@CodingLane A year ago
Thank you!
@gauravshinde8767
@gauravshinde8767 4 months ago
Doing a Masters in AI in Ireland. Lectures: understood nothing. Coding Lane videos: understood everything. Teaching is a skill, and you've got it, bro.
@firstkaransingh
@firstkaransingh A year ago
Good explanation bruh
@rimeblb573
@rimeblb573 A month ago
thank you
@jgg0207
@jgg0207 2 years ago
preciate this!! Very helpful
@CodingLane
@CodingLane 2 years ago
😇
@joelrcha3368
@joelrcha3368 2 years ago
Great explanation
@CodingLane
@CodingLane 2 years ago
Thanks
@qaiserali6773
@qaiserali6773 A year ago
Lovely!!
@CodingLane
@CodingLane A year ago
Thank you! 😊
@dhawalpatil7779
@dhawalpatil7779 A year ago
Hey Coding Lane, I have only 2 words to say: YOU ROCK 🎉
@CodingLane
@CodingLane A year ago
Hahaha… Thanks a lot! Cheers 🎉
@a.k.103
@a.k.103 2 years ago
Bhai, tere liye mere paas koi shabd nahi hai (I don't have any words for you). I have never seen content like this. I could write a comment like this on every video and it would still be too little compared to your hard work. Keep it up, I will always support you. Love you, bro.
@CodingLane
@CodingLane 2 years ago
Thank you so much 😊. I am really glad that my content is this valuable. This is one of the most loving comments I have seen.
@waleedrafi1509
@waleedrafi1509 2 years ago
Great Video.
@CodingLane
@CodingLane 2 years ago
Thank you!
@DangNguyen-yh5mm
@DangNguyen-yh5mm 2 years ago
Amazing job. I hope you make an example on a simple CNN with weight updates across the layers (conv -> relu -> pool -> ... just a few layers). Thank you for this video.
@CodingLane
@CodingLane 2 years ago
Yes... that is coming in the next video
@punamkhandar2678
@punamkhandar2678 3 months ago
informative
@gajendrasinghdhaked
@gajendrasinghdhaked 7 months ago
insane content
@fitrinailahanwar4102
@fitrinailahanwar4102 A year ago
Clear explanation, thank you sir, but please add subtitles so it can be understood more easily. Thank you if you read this suggestion.
@CodingLane
@CodingLane A year ago
Hi, YouTube somehow gave an error when adding subtitles to this video. Sorry for the inconvenience. But glad you found it helpful!
@John-wx3zn
@John-wx3zn 2 months ago
Thank you. Why is dL/dX being computed when backpropagation only updates the kernel weight matrix and the bias scalar?
@lodemrakesh7092
@lodemrakesh7092 2 years ago
Good one
@CodingLane
@CodingLane 2 years ago
Thanks
@ness3963
@ness3963 2 years ago
Thank you bhai 🙁
@CodingLane
@CodingLane 2 years ago
Welcome!
@user-lv2pg5so2j
@user-lv2pg5so2j A year ago
nice video sir
@CodingLane
@CodingLane A year ago
Thank you!
@user-fz3ui4zp6h
@user-fz3ui4zp6h A year ago
I caught this logic after investing a lot of time. However, I can't understand one aspect. In a fully connected model, the model receives the INPUT, we do the forward pass to get the loss, and based on it we adjust each weight in the model using backpropagation. During this process we never adjust the INPUT, because the INPUT can be different each time; we only need to train the model to predict. As I understand it, in a convolutional model a kernel plays the same role as a weight in a fully connected model, and during training our goal is to configure the kernel for better prediction. What I can't really understand is why we compute a gradient with respect to the INPUT in the convolutional model: the INPUT always changes while the kernel is what stays and gets configured, so by this logic we should only need to configure the kernel. Could you explain, please?
@John-wx3zn
@John-wx3zn 2 months ago
Thank you. Why didn't you show the activation function step?
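For reference (this is the standard result, not a step shown in this part): if an activation g sits between two layers, it adds one elementwise factor when the gradient passes through it,
```latex
X^{(l)} = g\!\left(Z^{(l-1)}\right)
\quad\Longrightarrow\quad
\frac{\partial L}{\partial Z^{(l-1)}}
  = \frac{\partial L}{\partial X^{(l)}} \odot g'\!\left(Z^{(l-1)}\right)
```
where ⊙ is elementwise multiplication; for ReLU, g'(Z) is 1 where Z > 0 and 0 elsewhere.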
@ASdASd-kr1ft
@ASdASd-kr1ft A year ago
Why does the sum appear in the chain rule for the derivative of a matrix with respect to another one? Do you know any source where I can figure out how this works? Thanks.
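For reference, the sum is just the multivariable chain rule: L depends on the kernel only through every entry of Z, so one term appears per intermediate variable. In the video's notation (stride 1, 'valid' convolution):
```latex
Z_{ij} = \sum_{m}\sum_{n} X_{i+m,\,j+n}\,K_{mn} + B
\quad\Rightarrow\quad
\frac{\partial Z_{ij}}{\partial K_{mn}} = X_{i+m,\,j+n},
\qquad
\frac{\partial L}{\partial K_{mn}}
  = \sum_{i}\sum_{j}\frac{\partial L}{\partial Z_{ij}}\,
    \frac{\partial Z_{ij}}{\partial K_{mn}}
  = \sum_{i}\sum_{j}\frac{\partial L}{\partial Z_{ij}}\,X_{i+m,\,j+n}
```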
@harryt5878
@harryt5878 A year ago
This is super useful, thank you! However, I am slightly stuck with finding dL/dK. For example, in my network the previous layer has output 14x14x20 and the convolutional layer uses 20 filters of size 3x3x20, so dL/dK needs to be of size 3x3x20x20; but applying convolution(X, dL/dZ) gives an output of 3x3x20 (as dL/dZ has size 12x12x20). How do I fix this?
@user-oq7ju6vp7j
@user-oq7ju6vp7j 7 months ago
Hi. Did you find out how to solve it? I have the same problem.
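For reference, one way the shapes work out here (a sketch of my own, assuming the HWC layout and sizes from the comment above: X is 14x14x20, dL/dZ is 12x12x20, and the kernel gradient is 3x3x20x20 = kh x kw x C_in x C_out): the gradient is computed per (input channel, filter) pair, one 2-D correlation each.
```python
import numpy as np

def corr2d(A, B):
    """'Valid' 2-D cross-correlation of A with B, stride 1."""
    h, w = A.shape[0] - B.shape[0] + 1, A.shape[1] - B.shape[1] + 1
    return np.array([[np.sum(A[i:i + B.shape[0], j:j + B.shape[1]] * B)
                      for j in range(w)] for i in range(h)])

X = np.random.randn(14, 14, 20)       # output of the previous layer (H, W, C_in)
dL_dZ = np.random.randn(12, 12, 20)   # gradient of this layer's output (H', W', C_out)
dL_dK = np.zeros((3, 3, 20, 20))      # (kh, kw, C_in, C_out)

for c in range(20):                   # input channel of each filter
    for f in range(20):               # which filter (output channel)
        dL_dK[:, :, c, f] = corr2d(X[:, :, c], dL_dZ[:, :, f])
```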
@spyder2374
@spyder2374 A year ago
Ultra nice explanation... 👍 Question: why do we update kernels during backpropagation? Kernels should be fixed, right? Say a kernel is a vertical edge detector; after backpropagation through it, what will it become?
@CodingLane
@CodingLane A year ago
Hi, kernels should not be fixed, because you don't know which kernel is detecting which kind of feature. The vertical edge detector was just an example to show that kernels detect edges; in a real model we don't know which kernel detects which kind of edge. The model decides that itself. That is why we update kernels through backpropagation, so that they automatically take appropriate values to identify features. Manually setting the values of so many kernels would be a very tedious job, and if you changed your model and added new kernels, you would have to set those values as well. Hope the answer helps, and sorry for the really late reply.
@arpit743
@arpit743 2 years ago
Hi Jay! Can you please explain the Einstein summation convention part? My issue is that the convention says M_ij = Σ_k (A_ik · B_kj), but in the video you mentioned dL/dK_mn = Σ (dL/dZ_ij · dZ_ij/dK_mn). Is this equation consistent with the Einstein convention, given that both i and j appear as summed indices?
@CodingLane
@CodingLane 2 years ago
Hello... yes, it is still consistent with the Einstein convention. The convention just says that if you have i and j appearing in cross positions (in the numerator of one factor and the denominator of the other), then a summation over those indices is implied.
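A small numeric check of that convention (my own sketch, with an assumed 4x4 input and 2x2 kernel): building the four-index tensor dZ_ij/dK_mn explicitly and summing over the repeated indices i, j gives the same dL/dK as directly correlating X with dL/dZ.
```python
import numpy as np

X = np.random.randn(4, 4)
dL_dZ = np.random.randn(3, 3)        # gradient w.r.t. the 3x3 output
kh = kw = 2

# dZ[i, j]/dK[m, n] = X[i+m, j+n] for a stride-1 'valid' convolution
dZ_dK = np.zeros((3, 3, kh, kw))
for i in range(3):
    for j in range(3):
        dZ_dK[i, j] = X[i:i + kh, j:j + kw]

# Summation over the repeated indices i, j — exactly what einsum does
dL_dK = np.einsum('ij,ijmn->mn', dL_dZ, dZ_dK)

# Same result as directly correlating X with dL/dZ
check = np.array([[np.sum(X[m:m + 3, n:n + 3] * dL_dZ) for n in range(kw)]
                  for m in range(kh)])
assert np.allclose(dL_dK, check)
```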
@rahulkumarjha2404
@rahulkumarjha2404 A year ago
Great video. I just have one doubt: why are we calculating dL/dX? I mean, using backpropagation we only update the weights, the biases, and the terms in the filter matrix. Please answer.
@CodingLane
@CodingLane A year ago
Hi, yes, the end goal is to update weights and biases. dL/dX helps us calculate dL/dW and dL/dB of the previous layer. Check out the 3:27 timestamp: dL/dX becomes dL/dZ for the previous layer.
@siddhanthsridhar4742
@siddhanthsridhar4742 A year ago
Hi, at 15:30 the equation for dL/dX12 is written wrong: it should be dL/dZ11 · K11 instead of dL/dZ12 · K11.
@mounmountain141
@mounmountain141 A year ago
I would like to ask if there is any material for learning how to handle the case when stride != 1 or there is dilation.
@CodingLane
@CodingLane A year ago
I found one article which showed that… but unfortunately I have lost it now. Maybe you can find it on Google.
@mounmountain141
@mounmountain141 A year ago
@@CodingLane OK, thank you for this very helpful video.
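For reference, one common trick for stride > 1 (a sketch of my own, assuming the input size fits the stride exactly; otherwise the results need cropping): dilate dL/dZ back onto the stride-1 grid by inserting zeros, then reuse the stride-1 formulas.
```python
import numpy as np

def corr2d(A, B):
    """'Valid' cross-correlation, stride 1."""
    h, w = A.shape[0] - B.shape[0] + 1, A.shape[1] - B.shape[1] + 1
    return np.array([[np.sum(A[i:i + B.shape[0], j:j + B.shape[1]] * B)
                      for j in range(w)] for i in range(h)])

def dilate(G, s):
    """Insert s-1 zeros between the elements of G (maps the strided output
    grid back onto the stride-1 grid)."""
    out = np.zeros((s * (G.shape[0] - 1) + 1, s * (G.shape[1] - 1) + 1))
    out[::s, ::s] = G
    return out

# Forward pass with stride 2, no padding: X is 5x5, K is 3x3 -> Z is 2x2
s = 2
X, K = np.random.randn(5, 5), np.random.randn(3, 3)
dL_dZ = np.random.randn(2, 2)          # would come from the next layer

# Dilate dL/dZ, then reuse the stride-1 results
D = dilate(dL_dZ, s)                                        # 3x3
dL_dK = corr2d(X, D)                                        # 3x3, same as K
dL_dX = corr2d(np.pad(D, K.shape[0] - 1), np.rot90(K, 2))   # 5x5, same as X
```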
@axitamohanta6743
@axitamohanta6743 A year ago
And what about dL/dZ?
@ramchandhablani9834
@ramchandhablani9834 A year ago
Hey Jay, at 3:34 I think you have written the equation wrong: Z11 = X11·K11 + X12·K12 + X21·K21 + X22·K22 + B. B is a 2x2 matrix; you cannot add it to the scalar Z11 😣
@puchaharinathreddy5556
@puchaharinathreddy5556 A year ago
Can you give us an example?
@samueljohanes5219
@samueljohanes5219 A year ago
what is L?
@haidarrmehsen
@haidarrmehsen 9 months ago
Am I missing something, or did he not mention what L is?
@gamerx3582
@gamerx3582 2 months ago
It's the loss function.
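For anyone wondering what L looks like concretely, two common choices (just examples; not necessarily the one used in the video):
```latex
L_{\text{MSE}} = \frac{1}{N}\sum_{k=1}^{N}\left(\hat{y}_k - y_k\right)^2
\qquad\text{or}\qquad
L_{\text{cross-entropy}} = -\sum_{k} y_k \log\hat{y}_k
```
Backpropagation starts from dL/dZ of this loss and pushes it back through the network, which is where the dL/dK, dL/dB, and dL/dX in the video come from.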
@surender_kovvuri
@surender_kovvuri 5 months ago
Bro, can you provide the Python code for this CNN model?
@Jeffrey-uw8un
@Jeffrey-uw8un A year ago
MATHHHHHHHHHHHHHHHHHHHHH TOOOOOO MUCH MATTTTTTTTTTTTTHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
@Jeffrey-uw8un
@Jeffrey-uw8un A year ago
BTW awesome video
@malanosi2869
@malanosi2869 4 months ago
For God's sake, please buy a mic. I love your lectures but can't hear a damn thing in some clips.
@CodingLane
@CodingLane 4 months ago
Hey thanks. I have got one, and will use it when I create new videos 😊
@berwinamir5325
@berwinamir5325 A year ago
thank you a lot, this is really very very helpful 🤩🙌
@CodingLane
@CodingLane A year ago
You're welcome. Glad I could be of help! 😄🙂