Code for Deep Learning - ArgMax and Reduction Tensor Ops

27,958 views

deeplizard


Comments: 69
@deeplizard
@deeplizard 6 years ago
Who's seen the movie IT? If you've seen it, tell us: 1) What country did you see it in? 2) On a scale from 0% - 100%, what's your rating of the movie? Check out the corresponding blog and other resources for this video at: deeplizard.com/learn/video/K3lX3Cltt4c
@sourajitsaha4724
@sourajitsaha4724 5 years ago
Bangladesh, 95%
@victortarnovskiy8407
@victortarnovskiy8407 5 years ago
Russia, 90%. It is truly a masterpiece.
@rajeshviky
@rajeshviky 5 years ago
Finland 83%
@林盟政
@林盟政 5 years ago
Taiwan, IT version 85%, element-wise version 120%
@min-youngchoi3833
@min-youngchoi3833 4 years ago
South Korea / 100% and might be overflow
@thepresistence5935
@thepresistence5935 2 years ago
I literally spent more than 6 hours watching this continuously and successfully completed 11 videos. It's like watching a movie :) Thanks a lot, I am understanding everything.
@ioannis.g.tzolas
@ioannis.g.tzolas 6 years ago
Congratulations on the terrific content and style of presentation. Your videos are like watching a TV series (with tensors as heroes). Can't stop watching!!! Congratulations!
@deeplizard
@deeplizard 6 years ago
Wow! Tensors as superheroes! Love it! Thank you! 📺
@mohammadsadilkhan1875
@mohammadsadilkhan1875 5 years ago
Correct me if I am wrong, but this is the most interesting and valuable resource for learning. Your idea of putting in videos of pioneers in this field is truly marvellous.
@user-or7ji5hv8y
@user-or7ji5hv8y 6 years ago
Really finding this series rewarding. Very much motivated to learn PyTorch.
@deeplizard
@deeplizard 6 years ago
Thank you, James! I'm glad to hear that!
@philtrem
@philtrem 5 years ago
Same!
@stormzrift4575
@stormzrift4575 5 months ago
This has been fantastic so far. The memes make it so much more engaging
@hzmuhabbet
@hzmuhabbet 2 years ago
I am happy to have found this channel; there's great work here, mate. Thanks for your efforts.
@_edwardedberg
@_edwardedberg 5 years ago
Dude... your videos should be the world standard of neural network education.
@deeplizard
@deeplizard 5 years ago
Thank you, Edward!
@Brahma2012
@Brahma2012 5 years ago
This is one of the best training series on PyTorch. The coverage is highly relevant and exhaustive. Thank you
@techfirsttamil
@techfirsttamil 6 years ago
No words to thank you guys. Always raising the bar for quality deep learning content. Hearty thanks from India
@deeplizard
@deeplizard 6 years ago
Thank you, Venkatesh! Appreciate that!
@bisnar1307
@bisnar1307 4 years ago
Thank you for this series. It's extremely helpful. Please continue teaching people more!
@saptarshidattaaot
@saptarshidattaaot 4 years ago
Will never ever forget element-wise operation
@kyle_bro
@kyle_bro 4 years ago
This really is the highest level of content. 10/10
@sumanmondal2152
@sumanmondal2152 4 years ago
Your way of teaching is amazing!!
@grandson_f_phixis9480
@grandson_f_phixis9480 4 months ago
Bro, brilliant work. Legendary!!
@harishjulapalli448
@harishjulapalli448 4 years ago
Great Video. I got the concepts very clear now. Thanks a lot for your efforts.
@Gee.U
@Gee.U 5 years ago
this channel is a blessing, thank you!
@DanielWeikert
@DanielWeikert 6 years ago
Love you guys. Great video again. Highly appreciated.
@deeplizard
@deeplizard 6 years ago
Thanks Daniel! 🚀
@thepresistence5935
@thepresistence5935 2 years ago
4:53 best moment :)
@haroldsu1696
@haroldsu1696 4 years ago
Really awesome lectures.
@ferielferiel5006
@ferielferiel5006 5 years ago
Oooh my god!!! I was watching this video late at night, and you scared the hell out of me
@deeplizard
@deeplizard 5 years ago
🤣🤣🤣🤣🤣
@Nikage23
@Nikage23 4 years ago
I wish I'd found these explanations earlier
@herrmann5384
@herrmann5384 5 years ago
I don't get why the order is changed when you set a number for "dim". For example:
t = torch.tensor([
[1,1,1,1],
[2,2,2,2],
[3,3,3,3]
])
print(t.shape) gives [3, 4]: 3 = first rows, 4 = then columns. But when you use the sum method, print(t.sum(dim=0)) gives tensor([6, 6, 6, 6]): dim=0 adds the columns and dim=1 the rows... why is it the other way around? Thanks so much!
@deeplizard
@deeplizard 5 years ago
Thinking in terms of rows and columns can be problematic when dealing with tensors because they are more abstract data structures. Instead of saying we have 3 rows and 4 columns, let's say we have 3 things, and for each of these we have 4 things.
Where are the 3 things? At dim=0.
What are the 3 things? Rank-1 tensors (arrays) of length 4.
How do we sum rank-1 tensors together? Via element-wise addition.
The last part (element-wise) is the key. When we take the sum with respect to dim=0, we are summing the elements that run along that dimension. If these elements are not numbers, the sum is an element-wise sum. Since each array has a length of 4 (4 indices), the result has a length of 4.
The same logic applies to dim=1.
What are the 4 things? Numbers.
How many groups of 4 things do we have? 3.
This means we get 3 sums of 4 things. This idea extends to higher-rank tensors as well. Let me know if this helps.
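A minimal sketch of this in PyTorch, assuming the 3x4 tensor from the question above (the values in the comments are what torch.sum produces):

import torch

t = torch.tensor([
    [1, 1, 1, 1],
    [2, 2, 2, 2],
    [3, 3, 3, 3]
])

# dim=0: element-wise sum of the three length-4 arrays
print(t.sum(dim=0))  # tensor([6, 6, 6, 6])

# dim=1: each length-4 array of numbers collapses to a single scalar
print(t.sum(dim=1))  # tensor([ 4,  8, 12])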
@hamedhojatian3098
@hamedhojatian3098 5 years ago
Perfect! Thank you!
@deeplizard
@deeplizard 5 years ago
Hey Hamed - You are welcome!
@shanthalvasanth7782
@shanthalvasanth7782 4 years ago
It makes sense that with t being a 3x4 tensor, the output of t.sum(dim=0) has the shape (1x4), but I don't get why the output tensor resulting from t.sum(dim=1) does not have the shape (3x1). It is unintuitive that summing over dim=1 yields a 1x3 tensor. EDIT: Just looked up the PyTorch documentation; apparently you need to pass 'keepdim=True' to retain the dimensions and get a (3x1) tensor when summing over dim=1.
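A minimal sketch confirming this, assuming the same 3x4 tensor as above (note that without keepdim the results are actually rank-1 tensors of shapes [4] and [3], not 1x4 or 1x3):

import torch

t = torch.tensor([
    [1, 1, 1, 1],
    [2, 2, 2, 2],
    [3, 3, 3, 3]
])

print(t.sum(dim=0).shape)                # torch.Size([4]) -- rank-1, not 1x4
print(t.sum(dim=1).shape)                # torch.Size([3]) -- rank-1, not 1x3
print(t.sum(dim=1, keepdim=True).shape)  # torch.Size([3, 1])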
@elibelash8238
@elibelash8238 5 years ago
Perfect, thank you.
@deeplizard
@deeplizard 5 years ago
Hey Eli - You are welcome!
@ilfat_khairullin
@ilfat_khairullin 3 years ago
Thank you!
@tingnews7273
@tingnews7273 6 years ago
I thought I had figured out the dim in torch.sum(dim=0). But when I saw torch.argmax(dim=0), I was confused again. It appears to me that the logic of these two is opposite. When it comes to nd-arrays, things seem to get more complex. Is there any way I can get it crystal clear?
@deeplizard
@deeplizard 6 years ago
Hello 刘新新 - The key is to remember that the reduction is element-wise unless we are working on the last dim. The last dim is always just an array of numbers (the scalar components).
Suppose we have this tensor:
t = [
[1,2,3,4],
[0,0,0,0]
]
How do we calculate t.sum(dim=0)? We do it by summing corresponding elements across dim-0. There exist two elements running along dim-0.
This one: [1,2,3,4]
And this one: [0,0,0,0]
Since these are arrays, the sum must be element-wise. This gives us the following:
[1+0, 2+0, 3+0, 4+0] = [1,2,3,4]
Now, with the argmax(dim=0) operation, it's the same, only the operation is different. This time the operation is not sum but argmax. Argmax gives us the index location of the max value. We have two possible index values, 0 or 1, because we are working with two elements:
t[0] = [1,2,3,4]
t[1] = [0,0,0,0]
When we do the argmax, we argmax corresponding elements (element-wise) across dim-0:
[argmax(1,0), argmax(2,0), argmax(3,0), argmax(4,0)] = [0,0,0,0]
Let me know if this helps and if you still have questions. I think the best way to make this crystal clear is to create some random examples and practice working them out. It is tricky.
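A minimal sketch of the two reductions side by side in PyTorch, assuming the tensor from this reply:

import torch

t = torch.tensor([
    [1, 2, 3, 4],
    [0, 0, 0, 0]
])

# sum across dim-0: element-wise addition of t[0] and t[1]
print(t.sum(dim=0))     # tensor([1, 2, 3, 4])

# argmax across dim-0: for each position, the index (0 or 1) of the max
print(t.argmax(dim=0))  # tensor([0, 0, 0, 0])

# on the last dim the elements are plain numbers, so no element-wise step
print(t.argmax(dim=1))  # tensor([3, 0]) -- ties resolve to the first index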
@tingnews7273
@tingnews7273 6 years ago
@@deeplizard Thank you for your reply. I spent some time watching the video again and again, read your reply, and tried it in code and worked it out by pen. I think I have some intuition now, but it's not that clear yet; I take it as a little progress. I think the most important thing for understanding it is element-wise: for a different dim, you get different elements. I can figure out argmax(dim=0) using this perspective: the element is a tensor, so when we go through every scalar in the tensor, we ask for the index of the tensor element. I have found that in deep learning, if you want to understand more, you must get the basic ideas clear. I have been doing this job for about 1 year or more, and still so many basic concepts are not that clear. Thank you very much.
@rubencg195
@rubencg195 6 years ago
Awesome!
@deeplizard
@deeplizard 6 years ago
Thanks Ruben! Really appreciate your comments!
@donfeto7636
@donfeto7636 2 years ago
love this
@reb5642
@reb5642 5 years ago
Thank you so much sir. Your video is so great
@deeplizard
@deeplizard 5 years ago
Hey REB - Thank you!
@marwaneboudraa9917
@marwaneboudraa9917 2 years ago
best explanations ever
@gactve2110
@gactve2110 4 years ago
I would drop the bassy sound effect when you type; it's annoying when using earphones. Except for that, great content! Keep it up!
@deeplizard
@deeplizard 4 years ago
Hey gactve - I appreciate the straightforward way you described the issue with the sound effect. Thanks! That sound has been removed in newer releases. 😊
@AnmolGargmed
@AnmolGargmed 6 years ago
At 3:40, isn't that a rank-1 matrix?
@deeplizard
@deeplizard 6 years ago
Hey Anmol - The tensor t at 3:40 is a rank 2 tensor. There are two indices required to access an element inside t. Refer here: deeplizard.com/learn/video/AiyK0idr4uM If it still isn't clear, let me know more about how you are thinking about it. Hope this helps.
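A minimal sketch of the "two indices" point, assuming a 2-D tensor like the ones used elsewhere in this thread (not necessarily the exact tensor shown at 3:40):

import torch

t = torch.tensor([
    [1, 2, 3, 4],
    [0, 0, 0, 0]
])

print(t[0])     # tensor([1, 2, 3, 4]) -- one index gives a rank-1 tensor
print(t[0][3])  # tensor(4) -- two indices reach a scalar component, so t is rank 2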
@takeyoshix
@takeyoshix 4 years ago
"You gotta step up your game, mate!"... care to share the link to that hilarious woman?
@deeplizard
@deeplizard 4 years ago
Link to her video in the description :D (Name: Sorelle)
@onePunch95
@onePunch95 5 years ago
tensor([1, 2, 3, 4])
>>> t.sum(dim=1)
tensor([10, 0])
In this case, why don't we get [10]?
@deeplizard
@deeplizard 5 years ago
Hi Satwik - I can't be sure because I cannot see the value of the tensor t. Note that the tensor shown, tensor([1,2,3,4]), is a single-dimensional tensor.
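A minimal sketch of the likely situation, assuming t was the 2-D tensor from the earlier reply (which reproduces both outputs shown in the question):

import torch

t = torch.tensor([
    [1, 2, 3, 4],
    [0, 0, 0, 0]
])

print(t.sum(dim=0))  # tensor([1, 2, 3, 4]) -- matches the first output shown
print(t.sum(dim=1))  # tensor([10,  0])     -- one sum per length-4 row, hence two values

# a truly 1-D tensor has no dim=1 to reduce over:
t1 = torch.tensor([1, 2, 3, 4])
print(t1.sum(dim=0))  # tensor(10)
# t1.sum(dim=1) raises IndexError: dimension out of range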
@heller4196
@heller4196 5 years ago
4:30 LMAO
@deeplizard
@deeplizard 5 years ago
Hey Omkar - 🤣 Thanks for your comments!
@jacoblawrence1138
@jacoblawrence1138 5 years ago
The sound you have added when typing the code, I mean the "bhooo..." kind of sound, is really annoying, and it gets irritating when watching the videos continuously. Just take this as a suggestion from a viewer.
@deeplizard
@deeplizard 5 years ago
Thanks Jacob. Are you using headphones? Which ones?
@ennardzhao1871
@ennardzhao1871 5 years ago
Stop putting scary clips into your tutorial videos!!!!