ResNets are tricky to conceptualise as there are many nuances to consider. Dr Bryce, you have done a great job here offering such a brilliant explanation that is both logical and easy to follow. You definitely have a gift for explaining complex ideas. Thank you!
@anirudhsarma937 · a year ago
Very, very, very good explanation. Almost all explanations of this forget about the influence of the random weights on the forward propagation and focus solely on the backward gradient multiplication, which is why I never understood why you needed to feed the input forward. Thanks a lot!
@ashishbhong5901 · 6 months ago
I have seen a lot of online lectures, but yours are the best for two reasons: the way you speak is not monotonous, which gives me time to comprehend and process what you are explaining, and second, the effort put into video editing to speed up the parts where you write things down on the board, which doesn't break the flow of the lecture. Liked your video. Thanks 🙂!
@vernonmascarenhas1801 · a month ago
I am writing a thesis on content-based image retrieval and had to understand the ResNet architecture in depth, and this is by far the most transparent explanation ever!!
@AdityaSingh-qk4qe · 3 months ago
This is the clearest video I've ever seen that explains ResNet for a layman while still conveying all the important and relevant information about it. I couldn't understand the paper, but with this video I finally understood it. Thanks a lot, Professor Bryce; I hope you create more such videos on deep learning.
@thelife5628 · a month ago
Another example of a random YouTuber with very few subscribers explaining a complex topic brilliantly... Thank you so much, sir!
@Engrbrain · a year ago
I am going to complete the entire playlist. Thanks, Bryce, you are a life saver
@zhen_zhong · 2 months ago
This tutorial is so clear that I can follow along as a non-native English speaker. Thanks a lot!
@alissabrave424 · a month ago
Brilliant explanation! Thank you so much, Professor Bryce!
@lallama202 · 4 months ago
Love your explanation; it made the concept and the flow of ResNet very easy to understand in 17 minutes! Really appreciate it.
@garydalley2349 · 2 months ago
Awesome explanation. Got me through a learning hurdle that several others could not.
@giordano_vitale · 5 months ago
Every single second of this video conveys an invaluable amount of information to properly understand these topics. Thanks a lot!
@shobhitsrivastava9112 · 10 months ago
This is the best residual network tutorial I have found so far. As constructive feedback, I would like you to dive more deeply into how shape mismatches are handled, because that part is not on par with the rest of the highly intuitive explanations of the various things happening in a ResNet.
@subramanianiyer3300 · 6 months ago
Thank you, Prof. Bryce, for explaining this with minimal complicated technicality.
@nguyentranconghuy6965 · 29 days ago
Nice explanation, thank you very much, Professor Bryce.
@raulpena9865 · 10 months ago
Thank you, Professor Bryce; ResNets were brilliantly explained by you. I am looking forward to new videos on more recent deep learning architectures!
@user-ux2gz7sm6z · 11 months ago
Your explanation is clear and concise! Thank you so much.
@rishabhagarwal4702 · a month ago
Brilliant explanation, the 3D diagrams were excellent and I could understand some tricky concepts, thank you so much!
@luisaruquipac.381 · 2 days ago
Awesome explanation! Thanks a lot.
@rohithr2071 · 2 months ago
Best explanation of ResNet I've come across so far.
@kindness_mushroom · 5 months ago
Thank you for the clear, concise, yet comprehensive explanation!
@ali57555 · 3 months ago
Thank you very much for putting in the time and effort. This is one of the best explanations I've seen (including those from US university professors).
@nilishamp245 · a year ago
you are brilliant!! Thank you for explaining this so well!!!!❤❤❤
@abdulsaboorkhan8337 · 4 months ago
Thank you so much Mr Bryce.
@user-ol1dx3nb3d · 5 months ago
Brilliant explanation. Thank you!
@minkijung3 · 10 months ago
Amazing. Thanks a lot. Your explanation is so clear. Please keep making videos professor!🙏
@lalop4258 · a year ago
Excellent class! I watched many videos before I came to this video and none explained the concept of residual networks as clearly as you did. Greetings from México!
@strictly-ai · 3 months ago
Best explanation of ResNet on the internet.
@sanjeevjangra84 · a month ago
So clear and well explained. Thank you!
@jonathanzkoch · a year ago
Great video on this, super informative.
@genericchannel8589 · 11 months ago
Awesome explanation!! Thank you for your effort :)
@nikhilthapa9300 · 8 months ago
Your explanations are very clear and well structured. Please never stop teaching.
@rabindhakal · 3 months ago
You have my respect, Professor.
@schmiede1998 · 8 months ago
Thank you so much for this video!
@vaibhavnakrani2983 · 7 months ago
Awesome. Loved it, clear and concise!
@MrMiguelDonate · 2 months ago
Brilliant explanation!!!
@business_central · a year ago
Omg this is so helpful! Thank you so much !!!
@ArtJug · a year ago
Wow, this explanation is amazing. So clear! I saw some videos about ResNets, but none of them describes what skip connections really are inside, what their internal structure and working logic is. Your explanation gives me much more: you explained the way of thinking, the internal structure, and the advantages. Wow!
@beatbustersindia3641 · 6 months ago
Brilliant explanation.
@rhysm8167 · 6 months ago
this was fantastic - thank you
@user-bg2vs5kh6n · 5 months ago
Great explanation, congrats.
@jiaqint961 · a month ago
Thanks for your video.
@1991liuyangyang · 2 months ago
great explanation, simple and straightforward.
@user-rb7vn3lt8t · 10 months ago
Really great explanation. Thanks, Prof. ♥
@sam-vv6gl · 3 months ago
thank you for the great explanation
@user-hd3uv9ym7f · 7 months ago
Thanks so much! A very informative, brief explanation.
@user-yv3ib9so5d · a month ago
What an explanation
@Bachelorarbeit-op4he · 6 months ago
great explanation, thank you!
@adityabhatt4173 · 5 months ago
Great Explanation !!!!
@user-uq7kc2eb1i · 6 months ago
Very nice video!
@AymanFakri-ou8ro · 5 months ago
very nice! thank you!
@bakhoinguyen5156 · 7 months ago
Thank you!!!
@puyushgupta1768 · 5 months ago
16 golden minutes.❤
@sajedehtalebi902 · a year ago
It was clear and useful. Thanks a lot!
@sashimiPv · 5 months ago
Prof. Bryce is the GOAT!
@swethanandyala · a month ago
Amazing explanation. Thank you, sir!
@sharmashikhashikha3 · 11 months ago
You are a star!
@charlesd4572 · a year ago
Superb!
@SatyamAnand-ow4ub · 11 months ago
Awesome explanation
@axe863 · 6 months ago
Loss landscape looking super smooth .....
@happyvioloniste08 · 9 months ago
Thank you 👏👏
@amitabhachakraborty497 · a year ago
Best Explanation
@wouladjecabrelwen1006 · 8 months ago
Who is this teacher? Damn he is good. Thank you
@lovenyajain6026 · 5 months ago
Wow. Thank you!
@zanzmeraankit4820 · 9 months ago
Got some meaningful insights from this video.
@paulocezarcunha · a month ago
great!
@kkjun7157 · a year ago
This is such a clean and helpful video! Thank you very much! The only thing I still don't know: during backpropagation, do we now have two sets of gradients for each block, one for going through the layers and one for going around them? If so, how do we know which one to use to update the weights and biases?
@csprof · a year ago
Good question. For any given weight (or bias), its partial derivative expresses how it affects the loss along *all* paths. That means we have to use both the around- and through-paths to calculate the gradient. Luckily, this is easy to compute because the way to combine those paths is just to add up their contributions!
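A minimal sketch illustrating the reply above (the toy one-weight layer and the use of PyTorch are assumptions, not anything from the video): with a residual block y = f(x) + x, the gradient at the block input is the through-path contribution plus the around-path contribution, and autograd adds them automatically.

    import torch

    # Toy residual block: y = f(x) + x, where f(x) = w * x stands in for "the layers".
    x = torch.tensor(2.0, requires_grad=True)
    w = torch.tensor(0.5, requires_grad=True)

    y = w * x + x      # through-path (w * x) plus around-path (x)
    loss = y ** 2
    loss.backward()

    # dloss/dx = 2*y * (w + 1): the through-path contributes w, the skip contributes 1,
    # and backpropagation simply adds the two contributions together.
    print(x.grad)                               # tensor(9.)
    print(2 * y.item() * (w.item() + 1.0))      # 9.0, the same value computed by hand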
@AsilKhalifa · 20 hours ago
Thanks
@kranthikumar9998 · 10 months ago
@csprof, By consistently including the original information alongside the features obtained from each residual block, are we inadvertently constraining our ResNet model to closely adhere to the input data, possibly leading to a form of over-memorization?
@wege8409 · 3 months ago
10:10 Concerns: shape mismatch *nervous sweating*
@newbie8051 · 11 months ago
Couldn't understand how we can treat the shape mismatch at 13:40. Great lecture nonetheless, thank you sir!! Understood what residual networks are 🙏
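On the shape-mismatch question raised here and above: in the original ResNet paper, when a block changes the spatial size or the number of channels, the skip connection itself is passed through a 1x1 convolution with the same stride so that the two tensors can be added. Below is a simplified sketch of such a projection shortcut in PyTorch (batch normalization omitted; the class name and layer sizes are illustrative assumptions, not the video's code).

    import torch
    import torch.nn as nn

    class DownsampleBlock(nn.Module):
        """Residual block whose shortcut projects the input to the new shape."""
        def __init__(self, in_ch, out_ch, stride=2):
            super().__init__()
            self.conv1 = nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1)
            self.conv2 = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1)
            # A 1x1 convolution with the same stride makes the skip path match the
            # main path in both channel count and spatial size, so the add is valid.
            self.shortcut = nn.Conv2d(in_ch, out_ch, kernel_size=1, stride=stride)
            self.relu = nn.ReLU()

        def forward(self, x):
            out = self.conv2(self.relu(self.conv1(x)))
            return self.relu(out + self.shortcut(x))

    x = torch.randn(1, 64, 56, 56)                 # one 64-channel 56x56 feature map
    print(DownsampleBlock(64, 128)(x).shape)       # torch.Size([1, 128, 28, 28])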
@anirudhsarma937 · a year ago
Can you please talk about GANs and, if possible, Stable Diffusion?
@mohammadyahya78 · a year ago
Thank you very much. I am not sure yet how residual blocks lead to faster gradient passing when the gradient has to go through both paths. As I understand it, this adds more overhead to computing the gradient; please correct me if I am wrong. Also, can you please explain more about how a 1x1 convolution reduces the depth, or make a video on it if possible? For example, I am not sure how an entire depth of, say, 255 gives output to one neuron.
@csprof · a year ago
You're right that the residual connections mean more-complicated gradient calculations, which are therefore slower to compute for one pass. The sense in which it's faster is that it takes fewer training iterations for the network to learn something useful, because each update is more informative. Another way to think about it is that the function you're trying to learn with a residual architecture is simpler, so your random starting point is a lot more likely to be in a place where gradient descent can make rapid downhill progress.

For the second part of your question, whenever we have 2D convolutions applied to a 3D tensor (whether the third dimension is color channels in the initial image, or different outputs from a preceding convolutional layer) we generally have a connection from *every* input along that third dimension to each of the neurons. If you do 1x1 convolution, each neuron gets input from a 1x1 patch in the first two dimensions, so the *only* thing it's doing is computing some function over all the third-dimension inputs. And then by choosing how many output channels you want, you can change the size on that dimension. For example, say that you have a 20x20x3 image. If you use 1x1 convolution with 8 output channels, then each neuron will get input from a 1x1x3 sub-image, but you'll have 8 different functions computed on that same patch, resulting in a 20x20x8 output.
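A quick check of the 20x20x3 example from this reply, written in PyTorch's channels-first layout (the choice of library is an assumption, not the video's): a 1x1 convolution with 8 output channels computes 8 functions of each 1x1x3 patch, so only the channel dimension changes.

    import torch
    import torch.nn as nn

    x = torch.randn(1, 3, 20, 20)        # one 20x20 image with 3 channels (NCHW layout)
    conv1x1 = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=1)
    print(conv1x1(x).shape)              # torch.Size([1, 8, 20, 20])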
@user-bw3bv1nz9l · a year ago
👍
@rayananwar8106 · 15 days ago
Do you mean that ResNet is just a skip connection, not an individual network?