1 x 1 Convolution (in Hindi)

Views: 5,958

Kshitiz Verma

Comments: 21
@Ajeet-Yadav-IIITD 2 years ago
3:42 Shouldn't it be n-f+1?
@KshitizVermaDL 2 years ago
Thank you for pointing it out! It should be n-f+1.
@Omunamantech 3 months ago
@@KshitizVermaDL If it is n-f-1, then it will be n - 1 - 1 = n - 2 = 26, am I right?
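For reference, the output-size formula discussed above can be checked with a few lines of Python; the sizes below (n = 28, filters of size 1 and 3) are illustrative assumptions, and conv_output_size is just a hypothetical helper name.

    # Output size of a convolution with no padding and stride 1: n - f + 1
    def conv_output_size(n, f):
        return n - f + 1

    print(conv_output_size(28, 1))  # 1x1 filter: 28 - 1 + 1 = 28 (spatial size unchanged)
    print(conv_output_size(28, 3))  # 3x3 filter: 28 - 3 + 1 = 26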
@rajshaikh3505 8 months ago
This channel is extremely underrated. All the explanations are amazing, I can't emphasize that enough. Please keep making more such content. Thank you so much!
@KshitizVermaDL 8 months ago
Thank you so much for the kind words!
@samirz7 4 years ago
I can't believe I didn't discover this channel earlier. Simple explanations of difficult concepts, thank you. Learning in Hindi really clarifies the concepts on another level :)
@KshitizVermaDL 4 years ago
Thank you for this awesome comment! Such comments make me keep going!
@somdubey5436 4 years ago
Very helpful video. You have a gift for making complex things easy to understand.
@harshjain4256 1 year ago
If we use a 1x1 filter, how will it detect features such as edges?
@raj-nq8ke 3 years ago
Simply speaking, a 1x1 filter reduces the number of channels.
@KshitizVermaDL 2 years ago
You can use it to do so.
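As a minimal sketch of that use, assuming PyTorch and an illustrative 28x28 input with 192 channels reduced to 32, a 1x1 convolution changes only the channel count and leaves the spatial size untouched:

    import torch
    import torch.nn as nn

    x = torch.randn(1, 192, 28, 28)  # (batch, channels, height, width); sizes are assumed
    conv1x1 = nn.Conv2d(in_channels=192, out_channels=32, kernel_size=1)

    y = conv1x1(x)
    print(y.shape)  # torch.Size([1, 32, 28, 28]) -- channels reduced from 192 to 32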
@kazimali4273 4 years ago
Please tell me what happens if we do not use the ReLU function. I mean, why do we break linearity?
@KshitizVermaDL 4 years ago
There are two videos explaining this! Have a look at them.
@kazimali4273 4 years ago
@@KshitizVermaDL Thanks, I will reply after seeing your videos.
@fierce10 2 years ago
If you keep layering linear functions, the result will still be linear, so all the layers could be represented as a single layer. But we use multiple layers to get a non-linear prediction. We use linearity within a layer only because linear transformations are easy to compute; we break linearity between layers.
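A small NumPy sketch of this argument, with arbitrary assumed layer sizes: stacking two linear layers without an activation collapses to a single linear layer, while a ReLU in between breaks that equivalence.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4,))      # arbitrary input
    W1 = rng.normal(size=(5, 4))   # first linear layer
    W2 = rng.normal(size=(3, 5))   # second linear layer

    # Two linear layers with no activation equal one linear layer with weights W2 @ W1.
    stacked = W2 @ (W1 @ x)
    collapsed = (W2 @ W1) @ x
    print(np.allclose(stacked, collapsed))    # True

    # Inserting a ReLU between the layers breaks this collapse.
    with_relu = W2 @ np.maximum(W1 @ x, 0.0)
    print(np.allclose(with_relu, collapsed))  # False (in general)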
@atiffaridi507 3 years ago
When an image of size height x width x channels is convolved with a single filter (e.g. 3x3x3), we get an output of size (height-1) x (width-1) x 1. If we convolve it with 16 filters, we get an output of size (height-1) x (width-1) x 16. Please correct this at time 2:41.
@KshitizVermaDL 3 years ago
I tried to check but couldn't find anything wrong! Can you check your calculations with a 1x1 filter?
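The shapes in question can also be checked directly; here is a PyTorch sketch assuming an illustrative 28x28 input with 3 channels and 16 filters:

    import torch
    import torch.nn as nn

    x = torch.randn(1, 3, 28, 28)  # assumed 28x28 input with 3 channels

    # 16 filters of size 3x3, no padding: spatial size becomes 28 - 3 + 1 = 26.
    print(nn.Conv2d(3, 16, kernel_size=3)(x).shape)  # torch.Size([1, 16, 26, 26])

    # 16 filters of size 1x1: spatial size stays 28.
    print(nn.Conv2d(3, 16, kernel_size=1)(x).shape)  # torch.Size([1, 16, 28, 28])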
@gender121 4 years ago
A very good lecture, but a slight doubt: 28x28x192 * 5x5x192 (#filters = 32) = 28x28x32. Why not (28-5)/1 + 1 = 24, i.e. 24x24x32? Please explain, sir. Has different padding been used to make it uniform?
@KshitizVermaDL 4 years ago
Thanks for the comment! You are right, padding has been used. I have defined the meaning of "same" padding in one of the earlier videos: it means you pad so as to keep the output dimensions the same as the input.
@eiesabyasachi 3 years ago
@@KshitizVermaDL Sir, as per my understanding, can it be calculated like this?
In the normal scenario: 28x28x192 * 5x5x192 (#filters = 32) = (28-5)/1 + 1 = 24, i.e. 24x24x32. The number of operations will be (28x28x192) * (5x5x32) = 120M.
In the 1x1 convolution scenario: 28x28x192 * 1x1x16 = 28x28x16, then 28x28x16 * 5x5x16 (#filters = 32) = (28-5)/1 + 1 = 24, i.e. 24x24x32. The number of operations will be (28x28x192) * (1x1x16) = 2.4M plus (28x28x16) * (5x5x32) = 10M, for a total of 12.4M.
Please correct me if the calculation is wrong.
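The multiplication counts in this thread can be verified with a short script; the sketch below assumes "same" padding (so the output stays 28x28, as in the video's example), and conv_multiplications is a hypothetical helper counting f * f * input_channels products per output value.

    def conv_multiplications(out_h, out_w, num_filters, f, in_channels):
        # Each output value needs f * f * in_channels multiplications.
        return out_h * out_w * num_filters * f * f * in_channels

    # Direct 5x5 convolution with 'same' padding: 28x28x192 -> 28x28x32
    direct = conv_multiplications(28, 28, 32, 5, 192)

    # Bottleneck: 1x1 conv down to 16 channels, then 5x5 conv up to 32 channels
    reduce_cost = conv_multiplications(28, 28, 16, 1, 192)
    expand_cost = conv_multiplications(28, 28, 32, 5, 16)

    print(f"direct:     {direct / 1e6:.1f}M")                       # ~120.4M
    print(f"bottleneck: {(reduce_cost + expand_cost) / 1e6:.1f}M")  # ~2.4M + ~10.0M = ~12.4M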
@manishswami877 4 years ago
Thank you, sir.