This was an awesome explanation! One small question (maybe correction?) - at around 7:20 shouldn't the Gini index of the diverse set be 1 - (0 + 0 + ... +0) since the probability of getting the same element twice is 0 - there are 10 unique elements i.e. no duplicates, so it's impossible to pick two of the same item.
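For anyone checking the arithmetic in this comment: the standard Gini impurity is 1 − Σ pᵢ², which corresponds to drawing two items *with* replacement, so a set of 10 unique elements gives 0.9 rather than 1 (only drawing without replacement makes a repeat impossible). A minimal Python sketch of that standard definition, with made-up example sets:

```python
from collections import Counter

def gini_impurity(items):
    """Gini impurity: 1 - sum(p_i^2), i.e. the probability that two
    items drawn independently WITH replacement are different."""
    n = len(items)
    return 1 - sum((count / n) ** 2 for count in Counter(items).values())

diverse = list(range(10))        # 10 unique elements, no duplicates
uniform = [0] * 10               # 10 copies of the same element

print(gini_impurity(diverse))    # ~0.9: a repeat still happens with prob 1/10
print(gini_impurity(uniform))    # 0.0: a repeat is certain, impurity is zero
```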
@lancezhang8921 4 hours ago
Thank you, professor, you have saved my life.
@rohitschauhanitbhu 15 hours ago
Your last video was good. But in this video, you didn't explain much about the key and query vectors (your explanation was very generic). It would have helped if you had dived a little deeper with examples. Overall, I really like your videos, but there is room for improvement. Thanks
@SanjayD-qc9og 18 hours ago
Great explanation sir...
@QuantaCompassAnalytics 22 hours ago
Luis, I really liked the explanation of the pseudocode 😊 Thank you for connecting on LinkedIn ❤
@-xx-7674 a day ago
This is probably the friendliest video out there that still covers all the important concepts of RL. Thank you!
@martaduarteteixeira8336 a day ago
Wow! Amazing! Thanks for the simplest explanation I've ever seen 🙂
@divikchoudhary8873 2 days ago
This is just Gold!!!!!
@NigusBasicEnglish 3 days ago
You are the best explainer ever. You are amazing.
@madhu1393 4 days ago
@SerranoAcademy Thank you, this is an awesome video. At around the 25-minute mark you explain the math as q(orange) * WQ * WK * k(phone), saying that WQ and WK help transform the embedding. This seems to be incorrect, because even before self-attention, q is multiplied by WQ to get Q, and likewise k by WK to get K. So the actual math is q * WQ * k * WK, and since matrix multiplication is not commutative, you can't just reorder it. It looks like we are finding the similarity between Q and K, yet Q and K belong to different embedding spaces. I'm not sure how it actually works as per your explanation. It would be helpful if you could answer this. Thanks.
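A tiny NumPy sketch of what the two projections do (toy sizes and random matrices, purely illustrative, not values from the video): the attention score is (q·WQ)(k·WK)ᵀ = q·WQ·WKᵀ·kᵀ, and because both projections land in the same d_k-dimensional space, the dot product between Q and K is well-defined even though WQ ≠ WK:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_k = 4, 2                    # toy sizes, chosen for illustration

q_emb = rng.normal(size=(1, d_model))  # embedding of one word, e.g. "orange"
k_emb = rng.normal(size=(1, d_model))  # embedding of another, e.g. "phone"
W_Q = rng.normal(size=(d_model, d_k))  # learned query projection
W_K = rng.normal(size=(d_model, d_k))  # learned key projection

Q = q_emb @ W_Q                        # shape (1, d_k)
K = k_emb @ W_K                        # shape (1, d_k)

# Both projections map into the SAME d_k-dimensional space,
# so Q . K^T is a meaningful similarity:
score = (Q @ K.T) / np.sqrt(d_k)       # scaled dot-product attention score

# The same quantity, written with the matrices grouped the other way:
same = (q_emb @ W_Q @ W_K.T @ k_emb.T) / np.sqrt(d_k)
```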
@user-pz2py4tp3t 4 days ago
What did I just see? You are a god.
@mohammadsalehdehghanpour9856 4 days ago
29:00 Thanks for the great video. But we can't just pick the greater number each day; we should have recorded the hidden state that led to the bigger probability and used that info to build the path.
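The commenter is describing the backpointer step of the Viterbi algorithm: for each day and state, record which previous state produced the best probability, then walk those pointers backwards from the best final state. A minimal sketch with made-up transition/emission numbers (not the values from the video):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path. pi: initial probs, A[i, j]:
    transition i -> j, B[i, o]: prob of emitting observation o from
    state i. Keeps backpointers instead of greedily taking the max."""
    T, n = len(obs), len(pi)
    delta = np.zeros((T, n))            # best path probability so far
    psi = np.zeros((T, n), dtype=int)   # backpointer: best previous state
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(n):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    path = [int(np.argmax(delta[-1]))]  # best final state...
    for t in range(T - 1, 0, -1):       # ...then follow the pointers back
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# Made-up 2-state, 2-observation example
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 1, 0], pi, A, B))     # → [0, 1, 0]
```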
@user-wm8hy8ce2o 4 days ago
What are the true values of the V, K, and Q matrices?
@shuang7877 4 days ago
A professor here - preparing for my course and trying to find an easier way to talk about these ideas. I learned a lot! Thank you!
@testtest-ws7uc 5 days ago
Wonderful! Thanks so much!
@gunamrit 5 days ago
Getting to see this after a heavy day at work is refreshing. Thank you so much for sharing.
@vardaanbhave2231 5 days ago
Dude, thanks a ton for explaining this so simply
@celismaroliveira6081 6 days ago
Thank you so much! Another amazing tutorial!!!
@celismaroliveira6081 6 days ago
That is the best explanation of Gini impurity I’ve ever seen! Even 8-year-old children can get it. Amazing! Congrats Luis Serrano/Serrano Academy!!
@vamsikrishna1131 7 days ago
Awesome examples & explanation. TY
@user-bw5np7zz5m 7 days ago
I love your clear, non-intimidating, and visual teaching style.
@SerranoAcademy 7 days ago
Thank you so much for your kind words and your kind contribution! It’s really appreciated!
@woojay 8 days ago
So good. Thank you.
@tariqo6756 9 days ago
Wow, that was a wonderful explanation. Thanks!
@wb7779 9 days ago
Reading the book is really hard: it's hard to control the studying environment, and it's hard to sustain enough focus with the necessary momentum to learn a concept. There are so many interruptions that can interfere with the process, but this concise and powerful video helped save a lot of time in learning. Thanks. ❤
@Ramkumar-uj9fo 9 days ago
Thanks. I really had FOMO. Best of luck ❤
@tashfiashamim 9 days ago
What an intuitive explanation! Kudos to you!
@maheshBasavaraju 9 days ago
Your explanation is almost like an elementary school teacher's: regression is adding and subtracting to the slope and y-intercept. Couldn't get easier than this. Great!
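That "add and subtract a little from the slope and y-intercept" idea fits in a few lines. This is my own illustrative toy code (equivalent to stochastic gradient descent on squared error), not code from the video:

```python
import random

def fit_line(points, lr=0.01, epochs=2000, seed=0):
    """Repeatedly pick a random point and nudge the slope m and
    intercept b a little toward it, proportional to the error."""
    random.seed(seed)
    m, b = 0.0, 0.0
    for _ in range(epochs):
        x, y = random.choice(points)
        error = y - (m * x + b)
        m += lr * error * x   # add (or subtract) a bit from the slope
        b += lr * error       # add (or subtract) a bit from the y-intercept
    return m, b

points = [(x, 2 * x + 1) for x in range(-5, 6)]  # noise-free line y = 2x + 1
m, b = fit_line(points)
print(round(m, 2), round(b, 2))  # close to 2 and 1
```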
@Ramkumar-uj9fo 9 days ago
Great finish! I have trained a few models. I understood it conceptually.
@viro-jx2ft 9 days ago
This is the best video you will find on HMMs. Complicated concepts handled soooo wellll 🥰
@Ramkumar-uj9fo 9 days ago
Absolutely brilliant red/blue glove experiment, and how observation changes the values. 🎉
@Ramkumar-uj9fo 9 days ago
Exciting. A conceptual journey. 🎉
@BigAsciiHappyStar 9 days ago
Muy BALL-issimo 😄 Loved the puns!!!!!😋😋😋
@jimbuckley9100 10 days ago
Really enjoyable, really understandable intro to RNN. Loved it - many thanks. But there is a tiny error at 18:07 that might cause a little confusion for us newbies when doing the 'Add' layer calculations for the graphical version of the RNN: The output of the 'Add' layer should read 100121, not 010121
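For other newbies, the "Add" layer this comment refers to is just elementwise addition of the transformed input and the transformed previous hidden state. A generic one-step sketch of a standard Elman-style RNN with toy sizes (not necessarily the exact numbers or architecture from the video):

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    add = x_t @ W_x + h_prev @ W_h + b  # the "Add" layer: sum both paths
    return np.tanh(add)                 # nonlinearity applied to the sum

# Toy sizes: 3-dim input, 2-dim hidden state, random weights
rng = np.random.default_rng(0)
W_x = rng.normal(size=(3, 2))
W_h = rng.normal(size=(2, 2))
b = np.zeros(2)

h = np.zeros(2)  # initial hidden state
for x_t in [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]:
    h = rnn_step(x_t, h, W_x, W_h, b)   # hidden state carries context forward

print(h.shape)  # (2,)
```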
@nc2581 11 days ago
Thank you for this fantastic video series on Transformers! The first two videos were particularly enlightening. I'm fascinated by how the query, key, and value vectors evolve before each attention module. It would be wonderful to gain a deeper understanding of the encoder-decoder architecture, particularly why the first attention belongs to the encoder while the subsequent ones are part of the decoder. Also, I'm intrigued by the visualization of linear transformations at each step during training, especially when outputs are recycled back into the decoder. Eagerly awaiting more insights!
@hoseinalavi3916 11 days ago
Your explanation is so great. Keep going, my friend. I am waiting for your next video.
@sambitmukherjee1713 11 days ago
Just a superb explanation!
@tianle465 13 days ago
Thank you for your great video, but how did the correlation between D and AC, and between E and B, come about? Why does this specific correlation form?
@andresfeliperiostamayo7307 13 days ago
The best explanation I've seen of Transformers. Thank you!
@RoyBassTube 13 days ago
Thanks! This is one of the best explanations of Q, K & V I've heard!
@youngvong3904 13 days ago
Really helpful!
@user-mz4kp5ot6g 14 days ago
Thanks for the Great video! Your content on Udacity is outstanding too!
@oceana.r545 14 days ago
India, trying too hard to be cool
@andrewkeeleyyonda 14 days ago
I'm so glad to know there are others who do this!!! I've been doing this pattern since I was 10 years old, and still do it obsessively in middle age. I see it as a pressure release for my brain--if I am stressed, anxious, or just don't have enough to occupy my brain, it activates this pattern in my shoulders, toes, any symmetrical body parts.
@BigAsciiHappyStar 14 days ago
13:32 "feel free to pause the video" reminds me of chess YouTuber agadmator 🤣
@danmcclelland5476 15 days ago
Here in 2024 and been trying to wrap my brain around this stuff for a couple years. Your vids have absolutely made it all click. Well done and many thanks!
@Ninjasharkcat 15 days ago
Wow, excellent video! Without any background on GMMs I was able to understand the concept and logic behind it. Thank you!!
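For readers who want to poke at GMMs after watching: a bare-bones 1-D EM fit on made-up data (my own illustrative code, not the notation or numbers from the video):

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=60):
    """Minimal EM for a 1-D Gaussian mixture: alternate soft assignments
    (E-step) with weighted re-estimation of the parameters (M-step)."""
    mu = np.linspace(x.min(), x.max(), k)   # spread initial means over the data
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
                  / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted mixing weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(0)
# Two made-up clusters centered at -3 and 2
x = np.concatenate([rng.normal(-3, 0.5, 300), rng.normal(2, 0.8, 300)])
_, mu, _ = em_gmm_1d(x)
print(np.sort(mu).round(1))  # recovered means land near -3 and 2
```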