It's literally insane how well you explained this. I instantly subbed; you deserve so much more attention.
@AladdinPersson 3 years ago
Wow thanks :)
@udbhavprasad3521 3 years ago
Honestly, there is no channel that even compares to this level of quality
@misabic1499 4 years ago
Hi. Your model building from scratch tutorials are really helpful. Eagerly waiting for more tutorials to come. I really appreciate it!
@AladdinPersson 4 years ago
I appreciate the kind words! Any videos in particular that you thought were good, and do you have any specific suggestions for the future?
@qiguosun129 3 years ago
This is literally the best and simplest explanation I've ever had, thanks.
@rajanalexander4949 1 year ago
Excellent tutorial on a very useful but sometimes confusing feature in NumPy. I would only add that "..." is syntactic sugar for omitting a bunch of indices.
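For reference, a small sketch of that Ellipsis shorthand (array shapes chosen arbitrarily, not from the video):

```python
import numpy as np

x = np.random.rand(2, 3, 4, 5)

# '...' stands for "all the axes I didn't name": both lines sum over the last axis.
a = np.einsum('ijkl->ijk', x)
b = np.einsum('...l->...', x)
assert np.allclose(a, b)

# The same shorthand writes a matmul over arbitrary leading batch dimensions:
y = np.random.rand(7, 2, 4, 5)
z = np.random.rand(7, 2, 5, 3)
out = np.einsum('...ij,...jk->...ik', y, z)
assert out.shape == (7, 2, 4, 3)
```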
@johngrabner 4 years ago
Another perfect video. Most valuable because it provides a foundation for your other video. Can't wait for your next einsum video.
@AladdinPersson 4 years ago
Really appreciate your comment! :)
@SantoshGupta-jn1wn 2 years ago
One of the most important videos I've ever seen.
@mayankkamboj4025 1 year ago
Wow, I finally get einsum! Thank you so much. And that LOTR reference was good.
@gauravmenghani4 2 years ago
Lovely. I always found einsum non-intuitive. Learnt a lot! Thanks :)
@stacksmasherninja7266 3 years ago
It almost felt like you implemented these functions in those libraries yourself! Great video.
@CrazyProgrammer16 1 year ago
Hey, but why does "i,j->ij" also involve a product? Nothing in the input is repeated here. Are there other rules?
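For readers with the same question: einsum always forms a product of one entry per operand; repetition only controls which indices get summed, namely the ones missing from the output. A minimal sketch:

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([10, 20])

# Both i and j survive into the output, so nothing is summed;
# the per-term product A[i] * B[j] remains: an outer product.
outer = np.einsum('i,j->ij', a, b)
assert np.allclose(outer, np.outer(a, b))

# Drop j from the output and it is summed away instead:
assert np.allclose(np.einsum('i,j->i', a, b), a * b.sum())
```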
@bourahmamasten4602 1 month ago
nicely explained, thank you!
@rekeshwardhanani920 2 years ago
Insane, brother. Excellent, just excellent.
@kenzhebektaniyev8180 1 year ago
Cool! Tbh I didn't believe you could explain it, but you did.
@iskrabesamrtna 3 years ago
I had to translate it to TensorFlow :) Very useful video for practice. Thank you!
@thecros1076 4 years ago
Learnt something new today❤️❤️. I always had a question: how and where did you learn all of this?
@AladdinPersson 4 years ago
I don't know all of this stuff. I research everything to try to make every video as good as I possibly can, so the process is usually that I learn something in depth and then decide to share it with you guys.
@thecros1076 4 years ago
@@AladdinPersson ❤️❤️❤️ Loved all of your videos... hard work and talent is a deadly combination... hope to see new project videos soon ❤️
@francesco_savi 4 years ago
Nice explanation, very clear! Thanks!
@valevan14 1 year ago
This is great. I just want to know, however, whether I can do an FFT of a Green's function using einsum. Note: I've been trying to implement the code for a week and never got the correct result.
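Without knowing the commenter's exact setup, a plain 1-D DFT can at least be written as an einsum contraction; `g` below is only a stand-in for a Green's function sampled on a grid, so this is a sketch rather than a fix for their code:

```python
import numpy as np

N = 64
g = np.random.rand(N)  # stand-in for a sampled Green's function

n = np.arange(N)
W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # DFT matrix W[k, n]

# The DFT is a matrix-vector contraction, so einsum expresses it directly:
G = np.einsum('kn,n->k', W, g)
assert np.allclose(G, np.fft.fft(g))
```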
@minma02262 3 years ago
Thank you for sharing this!
@iskhwa 2 years ago
Thanks, a perfect explanation.
@javidhesenov7611 1 year ago
Thanks for the awesome explanation.
@leofh1917 3 years ago
Thanx! This one is very useful!
@parasharchatterjee3223 2 years ago
It's the Einstein summation convention, used very commonly in physics; it just removes the clunky summation signs in pages-long calculations!
@fergalhennessy775 19 days ago
Wow, the matrix diagonal operations feel almost like an abuse of notation to me. What would happen if the matrix weren't square and had more rows than columns?
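In a quick test, NumPy simply refuses: a repeated index requires the two dimensions to match, so there is no "diagonal" for a non-square matrix:

```python
import numpy as np

x = np.random.rand(3, 3)
print(np.einsum('ii->i', x))  # fine: the three diagonal elements

y = np.random.rand(4, 3)  # more rows than columns
try:
    np.einsum('ii->i', y)
except ValueError as e:
    print(e)  # the repeated index 'i' has mismatched sizes (4 vs 3)
```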
@michaelmoran9020 3 years ago
Are the "free indices" part of standard Einstein notation, or something made up to allow you to exclude array dimensions from the einsum entirely?
@Raghhuveer 2 years ago
How does it compare to standard NumPy function calls in terms of performance and efficiency?
@danyalziakhan 3 years ago
So, basically einsum is the DSL that is shared between these libraries, right?
@haideralishuvo4781 4 years ago
Awesome, your channel is so underrated. I was struggling to find a good channel to learn PyTorch from; thankfully I found yours :D Can you cover pix2pix, CycleGAN, and R-CNNs? Would be grateful if you do.
@AladdinPersson 4 years ago
Appreciate you 👊 Many people have requested that so it's coming but can't promise when :)
@fergalhennessy775 19 days ago
great video!
@epolat19 3 years ago
Does einsum mess with TensorFlow's auto-differentiation?
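A quick check suggests it does not: tf.einsum behaves like any other differentiable op under tf.GradientTape. A minimal sketch:

```python
import tensorflow as tf

x = tf.Variable(tf.random.normal((2, 3)))
w = tf.Variable(tf.random.normal((3, 4)))

with tf.GradientTape() as tape:
    y = tf.einsum('ij,jk->ik', x, w)  # matrix multiply via einsum
    loss = tf.reduce_sum(y ** 2)

# Gradients flow through the einsum just like through tf.matmul:
grads = tape.gradient(loss, [x, w])
print([g.shape for g in grads])  # [(2, 3), (3, 4)]
```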
@iskrabesamrtna 3 years ago
Gives me an error for matrix-vector multiplication:
torch.einsum("ij, kj->ik", x, v)
einsum(): operands do not broadcast with remapped shapes [original->remapped]: [2, 5]->[2, 1, 5] [1, 3]->[1, 1, 3]
Same in TF:
Expected dimension 5 at axis 1 of the input shaped [1,3] but got dimension 3 [Op:Einsum]
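The error is a shape mismatch rather than an einsum quirk: the repeated index j must have the same size in both operands. A sketch that runs, assuming the (2, 5) matrix from the error message:

```python
import torch

x = torch.rand(2, 5)
v = torch.rand(1, 5)  # j must be size 5 in both operands, not (1, 3)

out = torch.einsum('ij,kj->ik', x, v)
print(out.shape)  # torch.Size([2, 1])

# With a true 1-D vector the string is even simpler:
u = torch.rand(5)
print(torch.einsum('ij,j->i', x, u).shape)  # torch.Size([2])
```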
@deoabhijit5935 3 years ago
Are you considering doing another video on advanced einsum?
@johnj7883 4 years ago
Thanks a lot, it saved my day.
@Han-ve8uh 3 years ago
One thing that wasn't mentioned in the video, which I realized halfway through, is that einsum is sometimes used on one operand and sometimes on two. I tried torch.einsum('ii->i', t, t) and got "RuntimeError: einsum(): more operands were provided than specified in the equation". This tells me that the number of operands must correspond to the number of comma-separated index groups on the left-hand side of ->.
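A small sketch of that rule, including a two-operand variant that does parse:

```python
import torch

t = torch.rand(3, 3)

# One comma-separated group ('ii') on the left means exactly one operand:
diag = torch.einsum('ii->i', t)

# Two operands need two groups, e.g. a row-wise dot product:
rowdot = torch.einsum('ij,ij->i', t, t)
assert torch.allclose(rowdot, (t * t).sum(dim=1))
```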
@iskrabesamrtna 3 years ago
einsum to rule them all, indeed.
@cassenav 2 years ago
Great video, thanks :)
@alfahimmohammad 4 years ago
Will einsum work for model parallelism in Keras models?
@AladdinPersson 4 years ago
I haven't tried that but I would imagine that it works
@alfahimmohammad 4 years ago
@@AladdinPersson I tried it. It wasn't good. I was better off manually assigning each layer to a GPU in PyTorch.
@SAINIVEDH 3 years ago
Can someone explain how the matrix diagonal is "ii->i"?
@ericmink 3 years ago
I think it's because if you wrote it as a nested loop, you would loop over all the rows with a variable `i`, and for the columns you would reuse the same variable (every entry at coordinates (i, i) is on the diagonal). As for the result: if you left the `i` out, it would sum the diagonal elements up; with it in there, it creates a list instead.
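Spelling that reading out in code (an illustrative sketch, not taken from the video):

```python
import numpy as np

x = np.random.rand(4, 4)

# 'ii->i' read as a loop: one index walks the rows and columns together.
diag = np.empty(4)
for i in range(4):
    diag[i] = x[i, i]
assert np.allclose(diag, np.einsum('ii->i', x))

# Dropping i from the output makes the loop accumulate instead: the trace.
assert np.isclose(np.einsum('ii->', x), np.trace(x))
```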
@Choiuksu 4 years ago
What a nice video!
@AladdinPersson 4 years ago
Thank you so much :)
@jeanchristophe15 4 years ago
I am not sure the "Batch matrix multiplication" example is correct, because i is used twice.
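For context, a repeated index is fine as long as it also appears in the output: such an index is never summed, only matched across operands. The index names below are arbitrary, not necessarily the video's:

```python
import torch

a = torch.rand(10, 2, 3)
b = torch.rand(10, 3, 4)

# The batch index n appears in both inputs and the output, so it is not
# summed; only j, which is absent from the output, gets contracted.
c = torch.einsum('nij,njk->nik', a, b)
assert torch.allclose(c, torch.bmm(a, b))
```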
@jamgplus334 3 years ago
nicely done
@rockapedra1130 3 years ago
Very cool!
@MorisonMs 4 years ago
3:37 (Outer product) there is no need to sum, simply M[i,j] = A[i,k]*B[k,j]
@lewis2865 3 years ago
It's matrix multiplication
@ALVONIUM 1 year ago
Absolutely incredible
@gtg238s 4 years ago
Great explanation!
@AladdinPersson 4 years ago
Thank you so much! :)
@ripsirwin1 3 years ago
This is so difficult to understand; I don't know if I'll ever get it.
@AladdinPersson 3 years ago
Sorry, maybe I didn't explain it well enough :/
@ripsirwin1 3 years ago
@@AladdinPersson No, you're great. I just have to work at it.
@AlbertMunda 4 years ago
awesome
@AndyLee-xq8wq 10 months ago
cool
@hieunguyentrung8987 3 years ago
np.einsum('ik,kj->ij', x, y) is actually much, much slower than np.dot(x, y) when the matrix size gets larger. Also, tf.einsum is slightly slower than tf.matmul, but torch.einsum is slightly faster than torch.matmul... though that's only from the perspective of my laptop's configuration.
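For anyone wanting to reproduce this, a minimal timing harness; note that np.einsum also accepts an optimize flag, which in my experience can close much of the gap (results will vary by machine and BLAS build):

```python
import numpy as np
import timeit

x = np.random.rand(512, 512)
y = np.random.rand(512, 512)

for stmt in ("np.dot(x, y)",
             "np.einsum('ik,kj->ij', x, y)",
             "np.einsum('ik,kj->ij', x, y, optimize=True)"):
    t = timeit.timeit(stmt, number=20, globals={'np': np, 'x': x, 'y': y})
    print(f'{stmt}: {t:.3f} s')
```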