Code for Deep Learning - ArgMax and Reduction Tensor Ops

28,017 views

deeplizard

A day ago

Comments: 69
@deeplizard 6 years ago
Who's seen the movie IT? If you've seen it, tell us: 1) What country did you see it in? 2) On a scale from 0% - 100%, what's your rating of the movie? Check out the corresponding blog and other resources for this video at: deeplizard.com/learn/video/K3lX3Cltt4c
@sourajitsaha4724 5 years ago
Bangladesh, 95%
@victortarnovskiy8407 5 years ago
Russia, 90%. It is truly a masterpiece.
@rajeshviky 5 years ago
Finland 83%
@林盟政 5 years ago
Taiwan, IT version 85%, element-wise version 120%
@min-youngchoi3833 4 years ago
South Korea / 100% and might be overflow
@thepresistence5935 2 years ago
I literally spent more than 6 hours watching this continuously and successfully completed 11 videos. It's like watching a movie :) Thanks a lot, I am understanding everything.
@ioannis.g.tzolas 6 years ago
Congratulations on the terrific content and style of presentation. Your videos are like watching a TV series (with tensors as the heroes). Can't stop watching!!! Congratulations!
@deeplizard 6 years ago
Wow! Tensors as superheroes! Love it! Thank you! 📺
@mohammadsadilkhan1875 5 years ago
Correct me if I am wrong, but this is the most interesting as well as the greatest resource for learning. Your idea of including videos of pioneers in this field is truly marvellous.
@stormzrift4575 7 months ago
This has been fantastic so far. The memes make it so much more engaging
@hzmuhabbet 2 years ago
I am happy to have found this channel; there's great work here, mate. Thanks for your efforts.
@user-or7ji5hv8y 6 years ago
Really finding this series rewarding. Very much motivated to learn PyTorch.
@deeplizard 6 years ago
Thank you, James! I'm glad to hear that!
@philtrem 5 years ago
Same!
@_edwardedberg 5 years ago
Dude.. your videos should be the world standard of neural network education.
@deeplizard 5 years ago
Thank you, Edward!
@Brahma2012 5 years ago
This is one of the best trainings on PyTorch. The coverage is highly relevant and exhaustive. Thank you
@techfirsttamil 6 years ago
No words to thank you guys. Always raising the bar for quality deep learning content. Hearty thanks from India!
@deeplizard 6 years ago
Thank you, Venkatesh! Appreciate that!
@bisnar1307 4 years ago
Thank you for this series. It's extremely helpful. Please continue teaching people more!
@sumanmondal2152 4 years ago
Your way of teaching is amazing!!
@harishjulapalli448 4 years ago
Great video. The concepts are very clear to me now. Thanks a lot for your efforts.
@kyle_bro 4 years ago
This really is the highest level of content. 10/10
@Fighter_Believer_Achiever 5 months ago
Bro brilliant work. Legendary!!
@DanielWeikert 6 years ago
Love you guys. Great video again. Highly appreciated.
@deeplizard 6 years ago
Thanks Daniel! 🚀
@Gee.U 5 years ago
This channel is a blessing, thank you!
@saptarshidattaaot 4 years ago
Will never ever forget element-wise operations.
@ferielferiel5006 5 years ago
Oh my god!!! I was watching this video late at night, and you scared the hell out of me!
@deeplizard 5 years ago
🤣🤣🤣🤣🤣
@haroldsu1696 4 years ago
Really awesome lectures.
@herrmann5384 5 years ago
I don't get why the order seems flipped when you set a number for "dim". For example:

t = torch.tensor([
    [1, 1, 1, 1],
    [2, 2, 2, 2],
    [3, 3, 3, 3]
])

print(t.shape) gives [3, 4]: 3 = rows first, then 4 = columns. But when you use the sum method, print(t.sum(dim=0)) gives tensor([6, 6, 6, 6]), so dim=0 adds the columns and dim=1 adds the rows... why is it the other way around? Thanks so much!
@deeplizard 5 years ago
Thinking in terms of rows and columns can be problematic when dealing with tensors, because they are more abstract data structures. Instead of saying we have 3 rows and 4 columns, let's say we have 3 things, and for each of those we have 4 things.

Where are the 3 things? dim=0.
What are the 3 things? Rank-1 tensors (arrays) of length 4.
How do we sum rank-1 tensors together? Via element-wise addition.

The last part (element-wise) is the key. When we take the sum with respect to dim=0, we are summing the elements that run along that dimension. If those elements are not numbers, the sum is an element-wise sum. Since each array has a length of 4 (4 indices), the result has a length of 4.

The same logic applies to dim=1. What are the 4 things? Numbers. How many groups of 4 things do we have? 3. This means we get 3 sums of 4 things each.

This idea extends to higher-rank tensors as well. Let me know if this helps.
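This "3 things, each containing 4 things" view can be checked directly in PyTorch, using the same 3x4 tensor from the question:

```python
import torch

# The 3x4 tensor from the question: 3 things along dim=0,
# each of which is an array of 4 numbers along dim=1.
t = torch.tensor([
    [1, 1, 1, 1],
    [2, 2, 2, 2],
    [3, 3, 3, 3],
])

# dim=0: element-wise sum of the three length-4 arrays,
# so the result also has length 4.
print(t.sum(dim=0))  # tensor([6, 6, 6, 6])

# dim=1: each length-4 array of numbers collapses to one sum,
# so the result has length 3.
print(t.sum(dim=1))  # tensor([ 4,  8, 12])
```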
@hamedhojatian3098 5 years ago
Perfect! Thank you!
@deeplizard 5 years ago
Hey Hamed - You are welcome!
@elibelash8238 6 years ago
Perfect, thank you.
@deeplizard 6 years ago
Hey Eli - You are welcome!
@rubencg195 6 years ago
Awesome!
@deeplizard 6 years ago
Thanks Ruben! Really appreciate your comments!
@AnmolGargmed 6 years ago
At 3:40, isn't that a rank-1 matrix?
@deeplizard 6 years ago
Hey Anmol - The tensor t at 3:40 is a rank 2 tensor. There are two indices required to access an element inside t. Refer here: deeplizard.com/learn/video/AiyK0idr4uM If it still isn't clear, let me know more about how you are thinking about it. Hope this helps.
@reb5642 5 years ago
Thank you so much, sir. Your video is so great.
@deeplizard 5 years ago
Hey REB - Thank you!
@ilfat_khairullin 3 years ago
Thank you!
@shanthalvasanth7782 4 years ago
It makes sense that with t being a 3x4 tensor, the output of t.sum(dim=0) has the shape (1x4), but I don't get why the output tensor resulting from t.sum(dim=1) doesn't have the shape (3x1). It is unintuitive that the sum over dim=1 yields a 1x3 tensor. EDIT: Just looked up the PyTorch documentation. Apparently you need to pass keepdim=True to retain the dimensions and get a (3x1) tensor when summing over dim=1.
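A quick sketch of the keepdim behavior mentioned in the edit above (the 3x4 tensor here is assumed to be the same one used earlier in the thread):

```python
import torch

t = torch.tensor([
    [1, 1, 1, 1],
    [2, 2, 2, 2],
    [3, 3, 3, 3],
])

# By default the reduced dimension is squeezed away, so
# summing a (3, 4) tensor over dim=1 yields a rank-1 tensor of length 3.
print(t.sum(dim=1).shape)                # torch.Size([3])

# keepdim=True retains the reduced dimension with size 1,
# giving the (3, 1) column shape the comment expected.
print(t.sum(dim=1, keepdim=True).shape)  # torch.Size([3, 1])
```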
@tingnews7273 6 years ago
I thought I had figured out dim with torch.sum(dim=0), but when I saw torch.argmax(dim=0) I got confused again. The logic of the two appears opposite to me, and with nd-arrays things seem to get even more complex. Is there any way I can get this crystal clear?
@deeplizard 6 years ago
Hello, 刘新新 - The key is to remember that the reduction is element-wise unless we are working on the last dim. The last dim is always just an array of numbers (the scalar components). Suppose we have this tensor:

t = [
    [1, 2, 3, 4],
    [0, 0, 0, 0]
]

How do we calculate t.sum(dim=0)? By summing corresponding elements across dim-0. There are two elements running along dim-0:

t[0] = [1, 2, 3, 4]
t[1] = [0, 0, 0, 0]

Since these are arrays, the sum must be element-wise. This gives us:

[1+0, 2+0, 3+0, 4+0] = [1, 2, 3, 4]

With argmax(dim=0), it works the same way; only the operation is different. Argmax gives us the index location of the max value. We have two possible index values, 0 or 1, because we are working with two elements. When we do the argmax, we take the argmax of corresponding elements (element-wise) across dim-0:

[argmax(1,0), argmax(2,0), argmax(3,0), argmax(4,0)] = [0, 0, 0, 0]

Let me know if this helps and if you still have questions. I think the best way to make this crystal clear is to create some random examples and practice working them out. It is tricky.
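The worked example in the reply above, as runnable code with the same tensor t:

```python
import torch

t = torch.tensor([
    [1, 2, 3, 4],
    [0, 0, 0, 0],
])

# sum(dim=0): element-wise sum of the two length-4 arrays
# running along dim-0.
print(t.sum(dim=0))     # tensor([1, 2, 3, 4])

# argmax(dim=0): element-wise argmax across the same two arrays.
# Each position j compares t[0][j] vs t[1][j]; the max is always
# in t[0], so every result index is 0.
print(t.argmax(dim=0))  # tensor([0, 0, 0, 0])
```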
@tingnews7273 6 years ago
@@deeplizard Thank you for your reply. I spent some time watching the video again and again, reading your reply, trying it in code, and working it out by pen. I think I have some intuition now, though it's not completely clear; I'll take it as a little progress. I think the most important thing for understanding this is element-wise: different dims mean different elements. I can figure out argmax(dim=0) from the perspective that each element is a tensor, so when we go through every scalar in the tensor, we ask for the index of the tensor element. I have found that in deep learning, if you want to understand more, you must get the basic ideas clear. I have been doing this job for about a year, and still so many basic concepts are not clear. Thank you very much.
@thepresistence5935 2 years ago
4:53 best moment :)
@donfeto7636 2 years ago
love this
@Nikage23 4 years ago
I wish I'd found these explanations earlier.
@marwaneboudraa9917 2 years ago
best explanations ever
@onePunch95 5 years ago
tensor([1, 2, 3, 4])
>>> t.sum(dim=1)
tensor([10, 0])
In this case, why don't we get [10]?
@deeplizard 5 years ago
Hi Satwik - I can't be sure, because I can't see the value of the tensor t. Note that the tensor shown, tensor([1, 2, 3, 4]), is a single-dimensional tensor.
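For illustration, here is a hypothetical t (an assumption, since the question doesn't show it) that would reproduce those outputs: with a 2-D tensor, sum(dim=1) produces one sum per row rather than a single total, which is where the [10, 0] comes from.

```python
import torch

# A guess at the tensor behind the question's output.
t = torch.tensor([
    [1, 2, 3, 4],
    [0, 0, 0, 0],
])

# One sum per row: the rows sum to 10 and 0 respectively.
print(t.sum(dim=1))  # tensor([10,  0])

# Summing over all dimensions gives the single total.
print(t.sum())       # tensor(10)
```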
@gactve2110 4 years ago
I would drop the bassy sound effect when you type; it's annoying when using earphones. Other than that, great content! Keep it up!
@deeplizard 4 years ago
Hey gactve - I appreciate the straightforward way you described the issue with the sound effect. Thanks! That sound has been removed in newer releases. 😊
@takeyoshix 4 years ago
"You gotta step up your game, mate!"... care to share the link to that hilarious woman?
@deeplizard 4 years ago
Link to her video in the description :D (Name: Sorelle)
@heller4196 5 years ago
4:30 LMAO
@deeplizard 5 years ago
Hey Omkar - 🤣 Thanks for your comments!
@ennardzhao1871 5 years ago
stop putting scary clips into your tutorial videos!!!!
@jacoblawrence1138 5 years ago
The sound you have added when typing the code, I mean the "bhooo..." kind of sound, is really annoying, and it gets irritating when watching the videos continuously. Just take this as a suggestion from a viewer.
@deeplizard 5 years ago
Thanks Jacob. Are you using headphones? Which ones?