Some days the internet makes me sad. Other days it reminds me of all the people with the same niche interests as me and how incredibly talented some of them are. Thanks for putting so much effort into this :)
@0mean1sigma 3 days ago
Thanks a lot. Glad you liked the video 😃
@psibarpsi 7 hours ago
*skilled. not talented. talent is god-given. skill is developed through practice. I think it's a little disrespectful to call someone talented - almost makes it seem like they didn't work for it. 🙂
@Otakutaru 3 days ago
"MINI" Project? What the heck?! You just crunched through a lot of hard-to-grasp technical implementations, coded a working example, shared it on your blog, AND made a fully animated video about it!! You make me mad.
@0mean1sigma 3 days ago
😅
@salmaalfawal6155 1 day ago
That was clickbait for me, cause it ain't MINI at all
@sanesanyo 13 hours ago
Great stuff Tushar. I have been keen on learning GPU programming so great to see your videos in my feed. Keep it up and all the best.
@0mean1sigma 12 hours ago
Great to hear! 😀
@nathanpotter1334 4 days ago
Yay, CUDA video. Feel like my timeline has been needing CUDA content
@eyannoronha831 2 days ago
I am reading the CUDA C programming book and your videos are super helpful in visualizing the memory access process! Thank you very much!
@0mean1sigma 2 days ago
Glad it was helpful!
@markzakharyan 1 day ago
dude this is actually amazing. you’re the cs version of 3b1b… keep up the great work!
@0mean1sigma 22 hours ago
Thanks a lot! I appreciate it 😃
@jorgegimenezperez9398 3 days ago
Hey, I’ve been going through the Programming Massively Parallel Processors book lately and doing some CUDA and this was a GREAT video!!!
@0mean1sigma 3 days ago
Thanks a lot! Glad the video was helpful!
@jojodi 10 hours ago
Good job. The reason workgroups are laid out in 1d/2d/3d grids is that all GPU compute APIs were first designed and implemented on top of existing graphics concepts where calculating outputs as e.g. 2D grids is a natural thing.
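The graphics heritage described above is easy to make concrete: a 2D grid of 2D blocks covers a 2D output the same way a graphics dispatch covers pixels. A CPU-side Python sketch of the CUDA-style index arithmetic (names like `grid_dim` and `block_dim` are illustrative, not from the video):

```python
# Emulate a CUDA-style 2D launch on the CPU: each (block, thread) pair
# computes one cell of a 2D output grid, just like a pixel shader would.
def global_indices(grid_dim, block_dim):
    """Yield (row, col) for every thread in a 2D grid of 2D blocks."""
    for by in range(grid_dim[1]):
        for bx in range(grid_dim[0]):
            for ty in range(block_dim[1]):
                for tx in range(block_dim[0]):
                    row = by * block_dim[1] + ty   # y -> vertical
                    col = bx * block_dim[0] + tx   # x -> horizontal
                    yield row, col

# A 2x2 grid of 2x2 blocks covers a 4x4 output, each cell exactly once.
cells = set(global_indices((2, 2), (2, 2)))
assert cells == {(r, c) for r in range(4) for c in range(4)}
```

On a real GPU the four nested loops all run in parallel; only the two index formulas survive inside the kernel.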
@epic_mole 2 days ago
I am from an embedded systems background but I love your work. Keep going brother, just don't quit! There's always an audience for great content.
@0mean1sigma 2 days ago
Thanks a lot! I’m in this for the long run 😃
@antoninmeunier 1 day ago
This is not a "mini" project, you made some real content here! Fantastic video, congrats!
@0mean1sigma 22 hours ago
Thanks! 😃
@Friedbutter101 1 day ago
Super satisfying to see Manim show the algorithm like that.
@0mean1sigma 1 day ago
Glad you enjoy it!
@anwar6336 3 days ago
Great video. I love the simplicity, and the great explanation.
@0mean1sigma 2 days ago
Thanks. Glad you found it useful.
@MariusNiveriHH 1 hour ago
Crazy good Manim skills! Perfect video.
@0mean1sigma 26 minutes ago
Appreciate it!
@slowedreverb6819 1 day ago
This is so beautiful and magnificent to see ❤❤❤🎉
@0mean1sigma 22 hours ago
Thank you so much!
@qwickstart 2 days ago
Beautiful visualization!! I am enjoying watching your videos. Keep up the good work!
@0mean1sigma 2 days ago
Thank you! Cheers!
@zeugzeugzeug 2 days ago
Thanks for the research! Keep going! I would like to see other algorithms being run and optimized on GPUs...
@huzaifamalik2346 3 days ago
Your content is very helpful and your teaching method is great.
@siddharth-gandhi 4 days ago
Beautiful video as usual. I'm motivated to pick up PMPP after sem end just from watching your videos!
@0mean1sigma 4 days ago
Thanks a lot and good luck 😃
@taufiquzzamanpeyash6008 3 days ago
High quality content. Subscribed.
@0mean1sigma 3 days ago
Thanks a lot. I really appreciate it 😃
@theintjengineer 4 days ago
Amazing🎉. Thank you!
@appleWhisky43 2 days ago
Gonna enjoy this knowledge
@JoeyLutes 2 days ago
yay i know a little about this now, thank you!!
@TheVirtualArena24 1 day ago
I'm too dumb to understand this, but I know it's something good. I'll understand it someday.
@JohnMullee 1 hour ago
really nice stuff, thanks
@0mean1sigma 26 minutes ago
Glad you liked it!
@7uz330 1 day ago
I study linear algebra and I'm shocked right now, cuz it's important for programming the hardware system
@brianmeehan6235 2 days ago
Nice Manim work!
@w花b 1 day ago
Can you talk about scan operations like the Blelloch and Hillis/Steele algorithms? Maybe you've already talked about them; I haven't seen all your videos. It would be nice to know in what context they're used, and to see a cool visualisation of them too.
@0mean1sigma 22 hours ago
I can take up these topics in some future videos. Thanks a lot for suggesting. 😀
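For readers who haven't met the scan the commenter mentions: here is a minimal CPU-side Python sketch of the Hillis–Steele inclusive prefix sum (an illustrative reference, not code from the video; on a GPU each iteration of the outer loop would run as one parallel pass over all elements).

```python
def hillis_steele_scan(xs):
    """Inclusive prefix sum, structured like the Hillis-Steele GPU scan:
    at step d, each element adds the value 2**(step) positions to its left.
    log2(n) steps instead of the sequential scan's n-1 additions."""
    out = list(xs)
    d = 1
    while d < len(out):
        # Read from the previous pass, write a fresh array (as a GPU would
        # with double buffering to avoid read/write races).
        out = [out[i] + (out[i - d] if i >= d else 0) for i in range(len(out))]
        d *= 2
    return out

print(hillis_steele_scan([3, 1, 7, 0, 4, 1, 6, 3]))
# -> [3, 4, 11, 11, 15, 16, 22, 25]
```

The Blelloch variant is work-efficient (O(n) additions via an up-sweep and down-sweep) but produces an exclusive scan; Hillis–Steele is simpler and step-efficient.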
@trapkat8213 1 day ago
Great content! One thing I have never understood is, for matrix multiplication, what is the sort of threshold in size that makes a GPU implementation faster than a CPU implementation? If I want to do one million multiplications of 4 x 4 matrices, ignoring the overhead required to set up the computations, is the GPU faster than the CPU? Surely not. What about 100 x 100? 1000 x 1000?
@0mean1sigma 22 hours ago
GPUs are suited for large datasets. I can't specify a number as it will depend on the algorithm and GPU specs.
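One rough, hedged way to reason about the threshold question: GPUs pull ahead when the arithmetic intensity (FLOPs per byte of memory traffic) is high enough to keep their many cores fed. For an n×n matrix multiply that ratio grows linearly with n, which is why a million tiny 4×4 products rarely beat the CPU while a single 1000×1000 product usually does. A back-of-the-envelope sketch (the function name is mine, and the model deliberately ignores caching and data reuse):

```python
def matmul_arithmetic_intensity(n, bytes_per_elem=4):
    """FLOPs per byte for one n x n fp32 matrix multiply:
    2*n**3 operations (n**3 multiplies + n**3 adds) over the traffic of
    reading A and B and writing C (3 matrices of n**2 elements)."""
    flops = 2 * n ** 3
    bytes_moved = 3 * n ** 2 * bytes_per_elem
    return flops / bytes_moved

for n in (4, 100, 1000):
    print(n, matmul_arithmetic_intensity(n))
# roughly 0.67, 16.7, and 166.7 FLOPs/byte: intensity grows as n/6
```

At 0.67 FLOPs/byte the 4×4 case is hopelessly memory-bound, so batching a million of them helps only if they are fused into one large launch; real cutoffs still depend on the kernel and the hardware, as the reply says.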
@uonliaquat7957 3 days ago
Thanks for the video. Where did you learn all this stuff? Any book or course?
@0mean1sigma 3 days ago
I've put links to a couple of Blog posts in the description. They were very helpful (especially when it came to verifying my code).
@uonliaquat7957 3 days ago
But you didn’t take any course right?
@0mean1sigma 3 days ago
Nope
@gauravpatil1630 1 day ago
What did you do? To compare GFLOPs, did you develop your own function to handle matrix mul?
@0mean1sigma 22 hours ago
I wrote SGEMM from scratch (it runs on a GPU).
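For anyone unsure what SGEMM computes: it is the single-precision general matrix multiply, C ← αAB + βC. A naive CPU reference in Python, the kind of thing that's handy for verifying a GPU kernel's output (an illustrative sketch, not the author's CUDA code):

```python
def sgemm_ref(alpha, A, B, beta, C):
    """Naive SGEMM reference: returns alpha * A @ B + beta * C.
    The inner sum is exactly what one GPU thread computes for one C[i][j]."""
    M, K, N = len(A), len(B), len(B[0])
    return [[alpha * sum(A[i][k] * B[k][j] for k in range(K)) + beta * C[i][j]
             for j in range(N)]
            for i in range(M)]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
C = [[1.0, 1.0], [1.0, 1.0]]
print(sgemm_ref(1.0, A, B, 0.0, C))  # -> [[19.0, 22.0], [43.0, 50.0]]
```

A tuned GPU SGEMM keeps this same triple loop but tiles A and B through shared memory so each loaded element is reused many times.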
@gauravpatil1630 16 hours ago
Got it 🔥 great man!!
@erichpoly4434 4 days ago
Thanks for your service
@engineeredarmy1152 2 days ago
Yet another Indian banger video
@spacepxl 3 days ago
What was the reason for surprise at the x-horizontal / y-vertical layout? It's the standard convention for image processing, which is what GPUs are designed for
@0mean1sigma 3 days ago
The y-axis generally points vertically up, and I'm not bothered too much about this either. What confused me was the (z, y, x) ordering. I understand that GPUs weren't designed for these kinds of computations, but it always confused me (when I started out).
@acasualviewer5861 1 day ago
@@0mean1sigma I think you mean i,j vs x,y... i often means the row in matrix operations, but x is always horizontal in Cartesian coordinates.
@nikhilkande336 4 days ago
I was hoping that someone would simplify the GPU's matrix multiplication for me, so thank you.
@0mean1sigma 4 days ago
Glad you found it useful 😃
@Vuden13 1 day ago
W project
@vkrotevich6171 3 days ago
So, after trying your benchmarks, I found that cuBLAS is the fastest in comparison to any of your approaches. Still, nice video.
@0mean1sigma 3 days ago
Thanks a lot for your comment. Yes, my implementations are slower than cuBLAS, but my focus was more on understanding the GPU programming concepts.
@AbhishekGupta-ny8ig 2 days ago
I have a question... where can I learn these concepts?
@0mean1sigma 2 days ago
I’ve provided some of the links in the video description. There are also some good textbooks. Good luck!
@__logan__duvalier__ 2 days ago
are these floating point or integer matrices ?
@0mean1sigma 2 days ago
Float
@__logan__duvalier__ 2 days ago
@@0mean1sigma thank you for sharing your GPU coding experience
@croncoder862 2 days ago
Are you using manim?
@LostAdmin 2 days ago
Yes he is
@juliansantos1900 2 days ago
Why does it feel like a 3blue1brown vid... Do you use Manim???