Code for Deep Learning - ArgMax and Reduction Tensor Ops

  28,042 views

deeplizard

A day ago

Comments
@deeplizard 6 years ago
Who's seen the movie IT? If you've seen it, tell us:
1) What country did you see it in?
2) On a scale from 0% - 100%, what's your rating of the movie?
Check out the corresponding blog and other resources for this video at: deeplizard.com/learn/video/K3lX3Cltt4c
@sourajitsaha4724 6 years ago
Bangladesh, 95%
@victortarnovskiy8407 6 years ago
Russia, 90%. It is truly a masterpiece.
@rajeshviky 6 years ago
Finland, 83%
@林盟政 5 years ago
Taiwan, IT version 85%, element-wise version 120%
@min-youngchoi3833 4 years ago
South Korea / 100% and might be overflow
@thepresistence5935 2 years ago
I literally spent more than 6 hours watching this continuously and successfully completed 11 videos. It's like watching a movie :) Thanks a lot, I am understanding everything.
@ioannis.g.tzolas 6 years ago
Congratulations on the terrific content and style of presentation. Your videos are like watching a TV series (with tensors as heroes). Can't stop watching!!! Congratulations!
@deeplizard 6 years ago
Wow! Tensors as superheroes! Love it! Thank you! 📺
@mohammadsadilkhan1875 5 years ago
Correct me if I am wrong, but this is the most interesting, and the greatest, resource for learning. Your idea of including videos of pioneers in this field is truly marvellous.
@hzmuhabbet 2 years ago
I am happy to have found this channel; there's great work here, mate. Thanks for your efforts.
@stormzrift4575 7 months ago
This has been fantastic so far. The memes make it so much more engaging.
@user-or7ji5hv8y 6 years ago
Really finding this series rewarding. Very much motivated to learn PyTorch.
@deeplizard 6 years ago
Thank you, James! I'm glad to hear that!
@philtrem 6 years ago
Same!
@Brahma2012 5 years ago
This is one of the best trainings on PyTorch. The coverage is highly relevant and exhaustive. Thank you
@bisnar1307 4 years ago
Thank you for this series. It's extremely helpful. Please continue teaching people more!
@techfirsttamil 6 years ago
No words to thank you guys. Always raising the bar for quality deep learning content. Hearty thanks from India.
@deeplizard 6 years ago
Thank you, Venkatesh! Appreciate that!
@_edwardedberg 5 years ago
Dude... your videos should be the world standard for neural network education.
@deeplizard 5 years ago
Thank you, Edward!
@harishjulapalli448 4 years ago
Great video. The concepts are very clear to me now. Thanks a lot for your efforts.
@sumanmondal2152 4 years ago
Your way of teaching is amazing!!
@Fighter_Believer_Achiever 6 months ago
Bro, brilliant work. Legendary!!
@kyle_bro 4 years ago
This really is the highest level of content. 10/10
@DanielWeikert 6 years ago
Love you guys. Great video again. Highly appreciated.
@deeplizard 6 years ago
Thanks Daniel! 🚀
@Gee.U 5 years ago
This channel is a blessing, thank you!
@haroldsu1696 4 years ago
Really awesome lectures.
@saptarshidattaaot 4 years ago
I will never ever forget element-wise operations.
@ferielferiel5006 5 years ago
Oh my god!!! I was watching this video late at night, and you scared the hell out of me!
@deeplizard 5 years ago
🤣🤣🤣🤣🤣
@hamedhojatian3098 5 years ago
Perfect! Thank you!
@deeplizard 5 years ago
Hey Hamed - You are welcome!
@shanthalvasanth7782 4 years ago
It makes sense that with t being a 3x4 tensor, the output of t.sum(dim=0) has the shape (1x4), but I don't get why the output tensor resulting from t.sum(dim=1) does not have the shape (3x1). It is unintuitive that summing over dim=1 yields a 1x3 tensor.
EDIT: Just looked up the PyTorch documentation - apparently you need to pass 'keepdim=True' to retain the dimensions and get a (3x1) tensor when summing over dim=1.
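The keepdim behaviour noted in the edit above can be sketched in plain Python (a simplified stand-in for torch.Tensor.sum on a 2-D tensor, not the real implementation):

```python
# Pure-Python sketch of summing a 2-D "tensor" (list of lists)
# along a dimension, mimicking torch.Tensor.sum's keepdim flag.

def sum_2d(t, dim, keepdim=False):
    if dim == 0:
        # Element-wise sum of the rows: one result per column.
        result = [sum(row[j] for row in t) for j in range(len(t[0]))]
        return [result] if keepdim else result               # keepdim -> shape (1, 4)
    else:
        # Sum within each row: one result per row.
        result = [sum(row) for row in t]
        return [[s] for s in result] if keepdim else result  # keepdim -> shape (3, 1)

t = [[1, 1, 1, 1],
     [2, 2, 2, 2],
     [3, 3, 3, 3]]

print(sum_2d(t, dim=0))                # [6, 6, 6, 6]       shape (4,)
print(sum_2d(t, dim=1))                # [4, 8, 12]         shape (3,)
print(sum_2d(t, dim=1, keepdim=True))  # [[4], [8], [12]]   shape (3, 1)
```

Without keepdim, the summed dimension is squeezed away entirely, which is why the result prints as a flat length-3 array rather than a 3x1 one.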
@herrmann5384 5 years ago
I don't get why the order is changed when you set a number for "dim". For example:
t = torch.tensor([
[1,1,1,1],
[2,2,2,2],
[3,3,3,3]
])
print(t.shape) gives [3, 4]: 3 = first the rows, 4 = then the columns. But when you use the sum method, print(t.sum(dim=0)) gives tensor([6, 6, 6, 6]). So dim=0 adds the columns and dim=1 the rows... why is it the other way around? Thanks so much!
@deeplizard 5 years ago
Thinking in terms of rows and columns can be problematic when dealing with tensors because they are more abstract data structures. Instead of saying we have 3 rows and 4 columns, let's say we have 3 things, and for each of these we have 4 things.
Where are the 3 things? dim=0
What are the 3 things? Rank-1 tensors (arrays) of length 4
How do we sum rank-1 tensors together? Via element-wise addition
The last part (element-wise) is the key. When we take the sum with respect to dim=0, we are summing the elements that run along that dimension. If these elements are not numbers, the sum is an element-wise sum. Since each array has a length of 4 (4 indices), the result has a length of 4.
The same logic applies to dim=1.
What are the 4 things? Numbers
How many groups of 4 things do we have? 3
This means we get 3 sums of 4 things. This idea extends to higher rank tensors as well. Let me know if this helps.
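The element-wise view in the reply above can be checked directly: summing with dim=0 is literally adding the three length-4 arrays together. A pure-Python sketch (not PyTorch itself):

```python
# dim=0: the elements running along dim-0 are three arrays of length 4.
# Summing them means element-wise addition of those arrays.
t = [[1, 1, 1, 1],
     [2, 2, 2, 2],
     [3, 3, 3, 3]]

# Element-wise addition of t[0], t[1], t[2]:
dim0_sum = [a + b + c for a, b, c in zip(t[0], t[1], t[2])]
print(dim0_sum)  # [6, 6, 6, 6] -- same as t.sum(dim=0) in PyTorch

# dim=1: the elements along dim-1 are plain numbers, 3 groups of 4,
# so we get 3 scalar sums, one per group.
dim1_sum = [sum(row) for row in t]
print(dim1_sum)  # [4, 8, 12] -- same as t.sum(dim=1)
```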
@reb5642 5 years ago
Thank you so much, sir. Your video is so great.
@deeplizard 5 years ago
Hey REB - Thank you!
@thepresistence5935 2 years ago
4:53 best moment :)
@elibelash8238 6 years ago
Perfect, thank you.
@deeplizard 6 years ago
Hey Eli - You are welcome!
@ilfat_khairullin 4 years ago
Thank you!
@tingnews7273 6 years ago
I thought I had figured out the dim in torch.sum(dim=0), but when I saw torch.argmax(dim=0), I got confused again. The logic of the two appears opposite to me. And with nd-arrays, things seem to get even more complex. Is there any way I can get this crystal clear?
@deeplizard 6 years ago
你好刘新新 - The key is to remember that the reduction is element-wise unless we are working on the last dim. The last dim is always just an array of numbers (the scalar components). Suppose we have this tensor:
t = [
[1,2,3,4],
[0,0,0,0]
]
How do we calculate t.sum(dim=0)? We do it by summing corresponding elements across dim-0. There are two elements running along dim-0.
This one: [1,2,3,4]
And this one: [0,0,0,0]
Since these are arrays, the sum must be element-wise. This gives us the following:
[1+0, 2+0, 3+0, 4+0] = [1,2,3,4]
Now, with the argmax(dim=0) operation, it's the same, only the operation is different. This time the operation is not sum but argmax. Argmax gives us the index location of the max value. We have two possible index values, 0 or 1, because we are working with two elements.
t[0] = [1,2,3,4]
t[1] = [0,0,0,0]
When we do the argmax, we argmax corresponding elements (element-wise) across dim-0:
[argmax(1,0), argmax(2,0), argmax(3,0), argmax(4,0)] = [0,0,0,0]
Let me know if this helps and if you still have questions. I think the best way to make this crystal clear is to create some random examples and practice working them out. It is tricky.
@tingnews7273 6 years ago
@@deeplizard Thank you for your reply. I spent some time watching the video again and again, reading your reply, trying it in code, and calculating by pen. I think I've got some intuition, though it's not completely clear yet; I'll take it as a little progress. I think the most important thing to understanding it is element-wise: a different dim means different elements. I can figure out argmax(dim=0) from the perspective that each element is a tensor, so when we go through every scalar in the tensor, we ask for the index of the tensor element. I've found that in deep learning, if you want to understand more, you must get the basic ideas clear. I have been doing this job for about a year, and still so many basic concepts are not clear. Thank you very much.
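The argmax-over-dim-0 reasoning in this thread can be sketched in plain Python on the same tensor (a simplified stand-in for torch.argmax, not the real implementation):

```python
# Pure-Python sketch of argmax along a dimension for a 2-D "tensor",
# mirroring the element-wise reasoning in the thread above.
t = [[1, 2, 3, 4],
     [0, 0, 0, 0]]

def argmax(values):
    # Index of the maximum value (first occurrence on ties), like torch.argmax.
    return max(range(len(values)), key=lambda i: values[i])

# argmax over dim=0: compare corresponding elements of t[0] and t[1].
dim0_argmax = [argmax([row[j] for row in t]) for j in range(len(t[0]))]
print(dim0_argmax)  # [0, 0, 0, 0] -- each column's max sits in row 0

# argmax over dim=1: within each row, the index of the largest number.
dim1_argmax = [argmax(row) for row in t]
print(dim1_argmax)  # [3, 0] -- 4 is at index 3; the all-zero row ties at index 0
```

So sum and argmax follow the same dim logic; only the operation applied across the dimension differs.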
@Nikage23 5 years ago
I wish I'd found these explanations earlier.
@rubencg195 6 years ago
Awesome!
@deeplizard 6 years ago
Thanks Ruben! Really appreciate your comments!
@marwaneboudraa9917 2 years ago
Best explanations ever.
@donfeto7636 2 years ago
Love this.
@AnmolGargmed 6 years ago
At 3:40, isn't that a rank-1 matrix?
@deeplizard 6 years ago
Hey Anmol - The tensor t at 3:40 is a rank-2 tensor. There are two indices required to access an element inside t. Refer here: deeplizard.com/learn/video/AiyK0idr4uM
If it still isn't clear, let me know more about how you are thinking about it. Hope this helps.
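The "count the indices" test in the reply above can be sketched in plain Python, treating a nested list as the tensor (in real PyTorch you would just call t.dim()):

```python
def rank(t):
    # Nesting depth of a list-of-lists "tensor": the number of
    # indices needed to reach a single scalar component.
    r = 0
    while isinstance(t, list):
        t = t[0]
        r += 1
    return r

t = [[1, 1, 1, 1],
     [2, 2, 2, 2],
     [3, 3, 3, 3]]

print(rank(t))     # 2 -- t[i][j] reaches a number, so t is rank-2
print(rank(t[0]))  # 1 -- a single row is a rank-1 tensor (an array)
```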
@gactve2110 4 years ago
I would drop the bassy sound effect when you type; it's annoying when using earphones. Other than that, great content! Keep it up!
@deeplizard 4 years ago
Hey gactve - I appreciate the straightforward way you described the issue with the sound effect. Thanks! That sound has been removed in newer releases. 😊
@heller4196 5 years ago
4:30 LMAO
@deeplizard 5 years ago
Hey Omkar - 🤣 Thanks for your comments!
@takeyoshix 4 years ago
"You gotta step up your game, mate!"... Care to share the link to that hilarious woman's video?
@deeplizard 4 years ago
Link to her video is in the description :D (Name: Sorelle)
@onePunch95 5 years ago
tensor([1, 2, 3, 4])
>>> t.sum(dim=1)
tensor([10, 0])
In this case, why don't we get [10]?
@deeplizard 5 years ago
Hi Satwik - I can't be sure because I cannot see the value of the tensor t. Note that the tensor shown, tensor([1, 2, 3, 4]), is a single-dimensional tensor.
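The distinction in the reply above can be illustrated: a sum(dim=1) result with two values implies a 2-D tensor with two rows, while a 1-D tensor only has dim 0 and yields one sum. A plain-Python sketch, where the 2-D tensor is a hypothetical reconstruction (the original t was never shown in full):

```python
# If t.sum(dim=1) returns two values, t must have two rows.
# A plausible 2-D t (hypothetical -- chosen to reproduce tensor([10, 0])):
t_2d = [[1, 2, 3, 4],
        [0, 0, 0, 0]]
row_sums = [sum(row) for row in t_2d]
print(row_sums)  # [10, 0] -- one sum per row, matching the question

# A 1-D tensor like tensor([1, 2, 3, 4]) has only dim 0, so summing
# collapses everything to a single scalar:
t_1d = [1, 2, 3, 4]
print(sum(t_1d))  # 10
```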
@ennardzhao1871 5 years ago
Stop putting scary clips into your tutorial videos!!!!
@jacoblawrence1138 5 years ago
The sound you've added when typing the code, a "bhooo..." kind of sound, is really annoying and gets irritating when watching the videos continuously. Just take this as a suggestion from a viewer.
@deeplizard 5 years ago
Thanks, Jacob. Are you using headphones? Which ones?