C4W1L09 Pooling Layers

161,805 views

DeepLearningAI

A day ago

Comments: 35
@danielregassa9805 4 years ago
For anyone wondering why average pooling isn't used often: its functionality can easily be replicated by a convolution filter with all elements equal to 1/n, where n is the number of elements in the filter.
@ekayesorko 3 years ago
Thanks, man.
@WahranRai 2 years ago
But backpropagation is in favor of pooling: a pooling layer has no learnable parameters, so there is nothing to backpropagate into.
@ah-rdk 6 months ago
Good catch. Also, you can argue n = f*f for an f x f filter.
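The equivalence described in this thread (average pooling as a convolution with a uniform 1/n filter) is easy to check numerically. Here is a minimal NumPy sketch with made-up helper names, assuming an f x f window with stride f and no padding:

```python
import numpy as np

def avg_pool(x, f):
    """Average pooling over a 2D array with window f and stride f."""
    H, W = x.shape
    out = np.zeros((H // f, W // f))
    for i in range(H // f):
        for j in range(W // f):
            out[i, j] = x[i * f:(i + 1) * f, j * f:(j + 1) * f].mean()
    return out

def conv_uniform_filter(x, f):
    """Same result via a convolution whose weights are all 1/(f*f), stride f."""
    w = np.full((f, f), 1.0 / (f * f))
    H, W = x.shape
    out = np.zeros((H // f, W // f))
    for i in range(H // f):
        for j in range(W // f):
            out[i, j] = np.sum(x[i * f:(i + 1) * f, j * f:(j + 1) * f] * w)
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
assert np.allclose(avg_pool(x, 2), conv_uniform_filter(x, 2))
```

The difference in practice is exactly what the replies point out: the pooling layer has nothing to learn, while the uniform filter would (pointlessly) sit in the set of trainable weights unless frozen.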
@jimmiemunyi 3 years ago
4 years later. Thank youuuuuuuu
@rajkkapadia 6 years ago
Amazing explanation sir.
@batoulZreik 8 months ago
I have a test and I am very grateful 🙏
@safi2297 6 years ago
It's really useful and easy to understand, thanks for the video. Keep up the good work.
@arkanandi8806 2 years ago
Pooling layers incorporate, to a certain extent, spatial invariance. It would be really great if you could describe why and how!
@inquisitiverakib5844 2 years ago
Awesome content. I have a question: if no learning occurs in the pooling layer, then why do pooling at all?
@흑룡-d6n 2 years ago
Professor, I understand that after the convolution layers the values are flattened into a vector and fed into the fc layers, and that after passing through a few more layers that reduce the dimension, softmax is applied to classify. But how does backpropagation work after that? The process is omitted: by what criterion backpropagation starts from the softmax output, passes through the fc layers, and applies weight updates to the filters of the convolution layer. That makes it hard to understand.
@Vinoth89Karur 4 years ago
Awesome sir.. Thank you so much..
@ervinperetz5973 2 years ago
Why does the number of channels double in AlexNet and VGG-19? Supposedly it's because overlapped pooling is used, but it's not clear how the extra channels are formed. (E.g., for 2x2 overlapped pooling, presumably with stride 1 in both directions, width and height are halved (unlike in your overlapped pooling example) and the number of channels doubles; that doesn't add up with respect to the number of pooling operations.)
@ati43888 8 months ago
Thanks
@sandipansarkar9211 3 years ago
Nice explanation
@davidtorres5012 3 years ago
Thanks a lot!
@juanandreslopezcubides5626 2 years ago
If I have an input of 11x11 and a MaxPool2D of size 3, according to the formula the output should be 9, but Keras says 3. Why?
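A likely explanation for anyone hitting the same thing: Keras's `MaxPooling2D` defaults `strides` to `pool_size` when strides aren't specified, so the output-size formula floor((n - f)/s) + 1 is evaluated with s = 3 rather than s = 1. A quick sketch of the formula (the helper name is my own):

```python
import math

def pool_output_size(n, f, s):
    """Standard output-size formula: floor((n - f) / s) + 1."""
    return math.floor((n - f) / s) + 1

# Keras default: stride equals the pool size, so s = 3.
print(pool_output_size(11, 3, 3))  # 3
# With stride 1, the formula gives the 9 expected from the lecture.
print(pool_output_size(11, 3, 1))  # 9
```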
@fariman-kashani 6 years ago
Thanks! So useful.
@nikilragav 4 months ago
Isn't average pooling actually a convolution, i.e., a box convolution?
@fatimahmath4819 5 years ago
Thank you very much, sir
@MrQwerty2524 5 years ago
So, does this formula mean that we subtract 0.5 when dealing with decimals?
@adityaachmad2265 5 years ago
Anyone know about backward pooling?
@codingtheworld674 3 years ago
kzbin.info/www/bejne/jnaWnKWcaKiEotU After 2 years :), but maybe someone else also wants to look.
@anandinamdar4054 6 years ago
Is max pooling differentiable?
@timharris72 6 years ago
No, there aren't any learnable parameters.
@NisseOhlsen 5 years ago
Yes, it is differentiable, but unless you made the size of the pooling window a parameter, you wouldn't get anything out of differentiating. Example: if you have a 3x3 window you have n = 9 elements, and an average pooling operation for one stride position would be f(x) = Sum(x)/n, where Sum(x) means the sum of all n elements. Differentiating with respect to an element of x gives you 1/n, which doesn't help, since x is the input, not the parameter we are trying to train. Differentiating with respect to n, allowing n to be trainable, would give you -Sum(x)/n^2. So IF we allowed the pooling window size to be trainable, then YES, pooling would be both differentiable AND usefully so. However, this is seemingly not done (although I'm not sure why).
@shubhamchandra9258 4 years ago
The entire neural network as a whole is differentiable. That couldn't happen if the pooling layer weren't differentiable. Differentiable means that for every small change in the input there is a small change in the output, not an abrupt change.
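To make the thread concrete: max pooling is differentiable almost everywhere, and in practice frameworks simply route the upstream gradient to the input position that won the max (all other positions get zero gradient). A minimal NumPy sketch, with helper names of my own, for a single channel with no padding:

```python
import numpy as np

def maxpool_forward(x, f=2, s=2):
    """Max pool a 2D array with window f and stride s; cache argmax positions."""
    H, W = x.shape
    oh, ow = (H - f) // s + 1, (W - f) // s + 1
    out = np.zeros((oh, ow))
    argmax = {}
    for i in range(oh):
        for j in range(ow):
            win = x[i * s:i * s + f, j * s:j * s + f]
            r, c = np.unravel_index(np.argmax(win), win.shape)
            argmax[(i, j)] = (i * s + r, j * s + c)  # position of the winner
            out[i, j] = win[r, c]
    return out, argmax

def maxpool_backward(dout, x_shape, argmax):
    """Route each upstream gradient entirely to the position that won the max."""
    dx = np.zeros(x_shape)
    for (i, j), (r, c) in argmax.items():
        dx[r, c] += dout[i, j]
    return dx

x = np.array([[1., 3., 2., 1.],
              [4., 6., 5., 2.],
              [7., 2., 1., 8.],
              [3., 1., 4., 9.]])
out, cache = maxpool_forward(x)                       # out = [[6, 5], [7, 9]]
dx = maxpool_backward(np.ones((2, 2)), x.shape, cache)  # 1s only at the winners
```

This is also why the "no learnable parameters" answer and the "it is differentiable" answer are both right: the layer has no weights of its own, but gradients still flow through it to earlier layers.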
@FasstEddie A year ago
Is it just me, or does the matrix glitch multiple times in this video?
@sjl4554 5 years ago
Underlying reason for max pooling??
@chaitanyag6297 5 years ago
Max pooling, a form of down-sampling, is used to identify the most important features. It means we just take the prominent edge/feature in that region: after a conv layer, edges will have a high positive value, so when you take the highest value in a region, you are looking at the edge/feature that dominates the others and is more distinguishing. This has the advantage of downsizing our data so the dense layer has fewer connections, while keeping the important features (and leaving the less dominant ones behind).
@abishekseshan1757 4 years ago
Chaitanya G But what is the guarantee that the pixel with the highest value is the most important? How can we determine that?
@oscarw1976 4 years ago
@@abishekseshan1757 Max pooling isn't necessarily applied to pixels; it can be applied to a layer of neuron outputs.
@rohitborra2507 4 years ago
@@chaitanyag6297 thanks bro
@snippletrap 4 years ago
@@abishekseshan1757 There is no guarantee, but it works well in practice.