Einsum Is All You Need: NumPy, PyTorch and TensorFlow

44,587 views

Aladdin Persson

1 day ago

Comments: 61
@udbhavprasad3521 3 years ago
Honestly, there is no channel that even compares to this level of quality
@matt.jordan 3 years ago
This is literally insane how well you explained this. I instantly subbed; you deserve so much more attention.
@AladdinPersson 3 years ago
Wow thanks :)
@qiguosun129 3 years ago
This is literally the best and simplest explanation I've ever had, thanks.
@johngrabner 4 years ago
Another perfect video. Most valuable because it provides a foundation for your other video. Can't wait for your next einsum video.
@AladdinPersson 4 years ago
Really appreciate your comment! :)
@rajanalexander4949 1 year ago
Excellent tutorial of a very useful but sometimes confusing feature in NumPy. I would only add that "..." is syntactic sugar for omitting a bunch of indices.
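A quick NumPy sketch of that ellipsis shorthand (my example, not from the video): `...` stands in for any number of leading dimensions, so one formula covers both a single matrix and a whole batch.

```python
import numpy as np

# A batch of two 3x4 matrices.
x = np.arange(24).reshape(2, 3, 4)

# '...' absorbs the leading batch axes; only the last two axes
# are named, so this transposes every matrix in the batch.
batched_t = np.einsum('...ij->...ji', x)
print(batched_t.shape)  # (2, 4, 3)
```

The same string works unchanged on a plain 2-D array, where `...` simply matches nothing.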
@mayankkamboj4025 9 months ago
Wow, I finally get einsum! Thank you so much. And that LOTR reference was good.
@SantoshGupta-jn1wn 2 years ago
One of the most important videos I've ever seen.
@stacksmasherninja7266 2 years ago
It almost felt like you implemented these functions yourself in those libraries! Great video.
@iva1389 2 years ago
I had to translate it to TensorFlow :) Very useful video for practice. Thank you!
@kenzhebektaniyev8180 1 year ago
Cool! Tbh I didn't believe you could explain it, but you did.
@CrazyProgrammer16 1 year ago
Hey, but why does "i,j->ij" also produce a product? Nothing in the input repeats here. Are there other rules?
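A NumPy sketch of the rule behind this question (my wording, not the video's): einsum always multiplies the operands over every combination of indices; summation only happens over indices that are *missing* from the output. In "i,j->ij" both indices survive, so nothing is summed, but the elementwise products remain: the outer product.

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([10, 20])

# Every left-hand index ('i' and 'j') also appears on the right,
# so nothing is summed -- but a[i] * b[j] is still formed for
# every (i, j) pair: the outer product.
outer = np.einsum('i,j->ij', a, b)

# Drop 'j' from the output and it gets summed over instead:
row_sums = np.einsum('i,j->i', a, b)  # equals a[i] * sum(b)
```

So multiplication is unconditional; the `->` side only decides what gets summed away.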
@rekeshwardhanani920 1 year ago
Insane, brother. Excellent, just excellent.
@thecros1076 4 years ago
Learnt something new today ❤️❤️ ...I always had a question: how and where did you learn everything?
@AladdinPersson 4 years ago
I don't know all of this stuff. I research everything to try to make every video as good as I possibly can, so the process is usually that I learn something in depth and then decide to share it with you guys.
@thecros1076 4 years ago
@@AladdinPersson ❤️❤️❤️ Loved all of your videos... hard work and talent is a deadly combination... hope to see new project videos soon ❤️
@haideralishuvo4781 4 years ago
Awesome, your channel is so underrated. I was struggling to find a good channel to learn PyTorch from; thankfully I found yours :D Can you cover pix2pix, CycleGAN, R-CNNs? Would be grateful if you do.
@AladdinPersson 4 years ago
Appreciate you 👊 Many people have requested that so it's coming but can't promise when :)
@iskhwa 2 years ago
Thanks, a perfect explanation.
@francesco_savi 3 years ago
Nice explanation, very clear! Thanks!
@johnj7883 4 years ago
Thanks a lot. It saved my day.
@iva1389 2 years ago
Einsum to rule them all, indeed.
@Han-ve8uh 2 years ago
One thing that wasn't mentioned in the video, which I realized halfway through, is that einsum is sometimes used on one operand and sometimes on two. I tried torch.einsum('ii->i', t, t) and got "RuntimeError: einsum(): more operands were provided than specified in the equation". This tells me that the number of operands must correspond to the number of comma-separated index groups on the left-hand side of ->.
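A small PyTorch sketch confirming this observation (the variable names are mine): one comma-separated index group on the left means exactly one operand, and a mismatch raises the error quoted above.

```python
import torch

t = torch.arange(9.0).reshape(3, 3)

# 'ii' is a single comma-separated group, so exactly one operand:
diag = torch.einsum('ii->i', t)  # diagonal: tensor([0., 4., 8.])

# Passing a second operand no longer matches the equation and raises.
raised = False
try:
    torch.einsum('ii->i', t, t)
except RuntimeError:
    raised = True
```

Two groups like 'ij,jk->ik' would, conversely, require exactly two operands.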
@ALVONIUM 1 year ago
Absolutely incredible
@valeriusevanligasetiawan6967 10 months ago
This is great. I just want to know, however, whether I can do an FFT of a Green's function using einsum. Note: I've been trying for a week to implement the code and never got the correct result.
@cassenav 2 years ago
Great video thanks :)
@iva1389 2 years ago
Matrix-vector multiplication gives me an error: torch.einsum("ij, kj->ik", x, v) raises "einsum(): operands do not broadcast with remapped shapes [original->remapped]: [2, 5]->[2, 1, 5] [1, 3]->[1, 1, 3]". Same in TF: "Expected dimension 5 at axis 1 of the input shaped [1,3] but got dimension 3 [Op:Einsum]".
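Reading the shapes in that error message, the likely cause (my interpretation, with made-up example tensors): 'j' is the summed index, so it must have the same size in both operands, but here x has j-size 5 while v has j-size 3. With matching sizes the same equation works.

```python
import torch

x = torch.rand(2, 5)
v = torch.rand(1, 5)  # the shared 'j' axis must be 5 to match x, not 3

# 'j' appears in both inputs and not in the output, so it is
# summed over -- both operands therefore need the same size there.
out = torch.einsum('ij,kj->ik', x, v)  # shape (2, 1)
```

This is equivalent to `x @ v.t()`; the error in the comment is just the two j-axes disagreeing (5 vs 3).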
@Raghhuveer 2 years ago
How does it compare in terms of performance and efficiency to standard numpy function calls?
@danyalzia6958 3 years ago
So, basically einsum is the DSL that is shared between these libraries, right?
@Choiuksu 4 years ago
What a nice video!
@AladdinPersson 4 years ago
Thank you so much :)
@jeanchristophe15 3 years ago
I am not sure the "Batch matrix multiplication" example is correct, because i is used twice.
@gtg238s 4 years ago
Great explanation!
@AladdinPersson 4 years ago
Thank you so much! :)
@SAINIVEDH 3 years ago
Can someone explain how the matrix diagonal is "ii->i"?
@ericmink 3 years ago
I think it's because if you wrote it as a nested loop, you would loop over all rows with a variable `i`, and for the columns you would reuse the same variable (every entry at coordinates (i,i) is on the diagonal). As for the result: if you left the `i` out, it would sum the diagonal elements up; if you keep it in, it creates a list instead.
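That loop picture, written out in plain NumPy (a sketch of my own to make the reply concrete):

```python
import numpy as np

t = np.arange(9).reshape(3, 3)

# 'ii->i': one loop variable indexes both axes, so only the
# entries at (i, i) -- the diagonal -- are visited.
diag = np.einsum('ii->i', t)   # [0, 4, 8]

# Dropping the output index sums those same entries: the trace.
trace = np.einsum('ii->', t)   # 0 + 4 + 8 = 12

# The equivalent explicit loop from the comment above:
loop_diag = np.array([t[i, i] for i in range(t.shape[0])])
```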
@AlbertMunda 4 years ago
awesome
@alfahimmohammad 3 years ago
Will einsum work for model parallelism in Keras models?
@AladdinPersson 3 years ago
I haven't tried that but I would imagine that it works
@alfahimmohammad 3 years ago
@@AladdinPersson I tried it. It wasn't good. I was better off manually assigning each layer to a GPU in PyTorch.
@hieunguyentrung8987 3 years ago
np.einsum('ik,kj->ij', x, y) is actually much, much slower than np.dot(x, y) when the matrix size gets larger. Also, tf.einsum is slightly slower than tf.matmul, but torch.einsum is slightly faster than torch.matmul... only from the perspective of my laptop's configuration, though.
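One hedged note on this benchmark: by default `np.einsum` evaluates the contraction naively rather than dispatching to BLAS, so passing `optimize=True` (a real keyword of `np.einsum`) often narrows the gap the comment describes. The results are numerically identical either way; the shapes below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((200, 300))
y = rng.random((300, 100))

# Three ways to compute the same matrix product:
a = np.dot(x, y)                                  # BLAS-backed
b = np.einsum('ik,kj->ij', x, y)                  # naive contraction
c = np.einsum('ik,kj->ij', x, y, optimize=True)   # optimized path
```

For chains of several operands, `optimize=True` (or the `opt_einsum` package) can also reorder the contraction, which matters far more than for a single matmul.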
@misabic1499 4 years ago
Hi. Your model-building-from-scratch tutorials are really helpful. Eagerly waiting for more tutorials to come. I really appreciate it!
@AladdinPersson 4 years ago
I appreciate the kind words! Any video in particular that you thought was good, and do you have any specific suggestions for the future?
@leonardmensahboante4308 2 years ago
@@AladdinPersson Please do a video on Python hooks, that is, how to use a pre-trained model as the encoder in U-Net architectures for image segmentation.
@gauravmenghani4 2 years ago
Lovely. I always found einsum non-intuitive. Learnt a lot! Thanks :)
@ripsirwin1 3 years ago
This is so difficult to understand, I don't know if I'll ever get it.
@AladdinPersson 3 years ago
Sorry, maybe I didn't explain it well enough :/
@ripsirwin1 3 years ago
@@AladdinPersson No, you're great. I just have to work at it.
@AndyLee-xq8wq 7 months ago
cool
@parasharchatterjee3223 2 years ago
It's the Einstein summation convention, which is used very commonly in physics and just removes the clunky summation signs in pages-long calculations!
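In symbols, the convention the comment refers to: a repeated index is summed implicitly, so the explicit sum can be dropped. For matrix multiplication:

```latex
% Explicit summation:
C_{ij} = \sum_{k} A_{ik} B_{kj}
% Einstein convention (repeated k is summed implicitly):
C_{ij} = A_{ik} B_{kj}
```

The second form is exactly what the einsum string 'ik,kj->ij' encodes: k appears in the inputs but not the output, so it is summed away.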
@epolat19 3 years ago
Does einsum mess up TensorFlow's auto-differentiation?
@michaelmoran9020 3 years ago
Are "free indices" part of standard Einstein notation, or something made up to allow you to exclude array dimensions from the einsum entirely?
@javidhesenov7611 1 year ago
Thanks for the awesome explanation.
@deoabhijit5935 3 years ago
Are you considering doing another video on advanced einsum?
@leofh1917 3 years ago
Thanks! This one is very useful!
@minma02262 3 years ago
Thank you for sharing this!
@rockapedra1130 3 years ago
Very cool!
@jamgplus334 3 years ago
nicely done
@MorisonMs 3 years ago
3:37 (outer product): there is no need to sum; simply M[i,j] = A[i,k]*B[k,j].
@lewis2865 3 years ago
It's matrix multiplication.
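A NumPy sketch to settle this exchange (my example): the expression M[i,j] = A[i,k]*B[k,j] contains a repeated index k, and under the summation convention a repeated index *is* summed, so it denotes matrix multiplication. The sum-free outer product uses distinct indices on vectors.

```python
import numpy as np

A = np.arange(6).reshape(2, 3)
B = np.arange(12).reshape(3, 4)

# Repeated index k is summed over: matrix multiplication.
M = np.einsum('ik,kj->ij', A, B)

a = np.array([1, 2])
b = np.array([3, 4, 5])
# No repeated index, nothing summed: the outer product.
O = np.einsum('i,j->ij', a, b)
```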