I must have watched like a million videos on Transformers but this is the first time I have completely understood it.
@LeonPetrou · 2 months ago
So glad to hear that! Let me know what tutorial video you'd like me to make next.
@sreelakshmi7472 · a day ago
I'm so glad I found this video. Great explanation, Leon!
@WangHongGuang2012 · 2 days ago
I watched several videos on this topic, and this one is definitely outstanding. The explanation is clear and easy to follow. So thankful!!
@Muddzdk · 2 days ago
I watched many videos on the topic and this one was the easiest to follow. Visuals, animations, analogies and real-world examples help a lot, so keep using those in your videos.
@LeonPetrou · 2 days ago
Thank you so much for the feedback! I'll make more videos like this.
@ravindranshanmugam782 · 9 months ago
Excellent, went through multiple videos for a basic understanding of Transformers. This is the best one and I could grasp it quickly. Effortlessly explained. Well done!!
@LeonPetrou · 9 months ago
Thank you Ravindran! I try my best to teach things the same way that I'd like to be taught, which is simple and step-by-step. Let me know what other videos you'd like to see from my channel.
@ravindranshanmugam782 · 9 months ago
Hi Leon, it would be great if you could make videos on LangChain and its applications, which are trending now. You could also add topics like vector databases, embeddings, word2vec and so on. Anything on GenAI is hot in the tech space right now. Thanks.
@ovidioe.cabeza4750 · 7 months ago
Same for me. I am a Python backend dev and getting transformers was tough, but you helped me a lot, thank you!
@Yaser-z9j · 6 months ago
Me too @ravindranshanmugam782
@NithishAnuth · 5 days ago
Went through many videos on transformers but this one is THE BEST!!
@LeonPetrou · 4 days ago
Appreciate that! What video should I make next?
@nimitbhandari2859 · 2 months ago
Finally a simple and terse explanation for transformers, loved it! ❤😊
I believe that this is the best video for transformers, embeddings and tokenisation on the internet!!!
@LeonPetrou · 5 months ago
Appreciate that! Let me know what tutorial you want to see next!
@atharvadeshpande4976 · 29 days ago
Thank you for the explanation. Too good.
@60pluscrazy · a month ago
Best explanation 🎉🎉🎉
@fhs14647 · 23 days ago
Very well explained, thank you so much!
@michaelzap8528 · 7 months ago
Best. Finally I understand how GPT works now. Thanks mate, you're the champion.
@marcinnnWL · 4 months ago
Best material for me: an easy explanation that matches my way of thinking. Thanks Leon. :)
@LeonPetrou · 4 months ago
Appreciate that!
@sukumarane2302 · 3 months ago
You made it simple… really excellent! Thanks 🙏
@monicabhogal · 2 months ago
Awesome, beautifully explained ❤
@rajathslr · 5 months ago
You have put a lot of effort into this wonderful video, thank you so much.
@ravideepa · 4 months ago
Excellent tutorial on this concept. Awesome 👏
@vj7668 · 7 months ago
Excellent!!! Thanks for simplifying it. Loved it!
@LeonPetrou · 7 months ago
Appreciate that, thank you!
@KonstantinosEvangelides · 6 months ago
Can you do a separate video exploring embeddings further and what the vector embeddings actually represent, more thoroughly? Great video!!
@LeonPetrou · 6 months ago
Great idea! I'll do this next.
@sudhanshusaxena8134 · 8 months ago
Great explanation.
@LeonPetrou · 8 months ago
Thank you very much!
@anibeto7 · 9 months ago
It was indeed a very informative video. It cleared up a lot of the important ideas. Thanks a lot.
@DarrabEducation · 4 months ago
It's amazing. More videos like this will put you at the top; another video with a worked example would be great.
@F30-Jet · 5 months ago
6:56 To add more clarity on "pretrained": pretrained means the model has already acquired general-purpose knowledge.
@programminglover2976 · 7 months ago
Thank you so much.. really, really well explained.
@JRKyt00 · 6 months ago
Agreed--best explanation I've found. Now I get it (well...)!
@MotulzAnto · 8 months ago
THANK YOU! Easy explanation..
@LeonPetrou · 8 months ago
Appreciate it!
@crazyant1080 · 4 months ago
Thanks a lot.
@PigiMontieri · 2 months ago
Wow thanks ♥️
@karannesh7700 · 7 months ago
Thanks for this great video!
@LeonPetrou · 7 months ago
Appreciate it!
@JohnCohen-ur5hk · 7 months ago
Very Good Explanation. Thank You
@Clammer999 · 8 months ago
Wow, this is one of the easiest-to-understand videos on how transformers work. You also explained tokens and embeddings very well, which I was searching for. I'm a complete newbie and I kept hearing "neurons" and "neural networks". Is a neuron a physical device/hardware, or is it actually an algorithm? And a neural network is not a physical network?
@LeonPetrou · 8 months ago
Thank you! Neural networks, and everything explained in this video, are all software (except biological neurons, which are in the human brain); it's all algorithms. It's basically just code. The hardware that the code runs on usually just requires high processing power / RAM. This can be a CPU or GPU.
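To illustrate the point above, here is a rough sketch of a single artificial "neuron" in Python. The numbers are made up purely for illustration; the takeaway is that a neuron is just arithmetic in code, not hardware.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, then a non-linear "activation" (sigmoid here).
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Made-up inputs, weights and bias; a real network learns these during training.
print(neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))  # a number between 0 and 1
```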
@otenyop · 5 months ago
Great explanation
@samrmit6253 · 5 months ago
Brilliant video, thanks.
@Omniassassin7 · 10 months ago
This is amazing, thanks a lot man! Quick question, how are the self-attention layers produced? Does the model dynamically “decide” which contextual layer to use depending on the prompt, or is the set of layers learnt during training?
@LeonPetrou · 10 months ago
My pleasure man, glad you like it. That's a great question. The structure and behavior of these self-attention layers are determined during the model's training phase, not during inference. Simply put, the model learns which words in a sentence should pay attention to which other words to better understand the sentence's meaning. This learning process is fixed once the model is fully trained; it does not change or decide on a different structure when it's given new prompts to process.
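A minimal NumPy sketch of the mechanism described above, single-head scaled dot-product self-attention. The random Wq/Wk/Wv matrices here are stand-ins for the projection weights a real Transformer would learn during training; only the mechanism is illustrated.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    # Project each token's embedding into a query, key and value vector.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Score how much each token should attend to every other token.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
    # Each output is a weighted mix of the value vectors.
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(6, 16))                           # 6 tokens, 16-dimensional embeddings
Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)             # (6, 16)
```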
@changliu7553 · 4 months ago
@LeonPetrou Thanks. I am starting to think attention could be used for Google search - that way we wouldn't have to rely on silly SEO tricks. The same question can be asked in 1000 different ways.
@Yaser-z9j · 6 months ago
Awesome 👌 thank you so much, you are amazing.
@baigsaab47 · 4 months ago
Hi Leon. Could you kindly make a video explaining the LLM inference library vLLM?
@poorjahangiri11 · a month ago
Very well done!
@LeonPetrou · a month ago
Appreciate it!
@AvaMichl · 5 months ago
Does one word always equal one token embedding?
@LeonPetrou · 5 months ago
@AvaMichl No, not always; it's just a simple way to think about it, but on average one token is about 4 characters of text.
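A quick way to see this for yourself, assuming you have OpenAI's tiktoken library installed; the exact counts will vary with the text and the tokenizer.

```python
# pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")   # the encoding used by GPT-3.5/GPT-4
text = "Transformers are surprisingly understandable."
tokens = enc.encode(text)
print(len(text.split()), "words ->", len(tokens), "tokens")
print([enc.decode([t]) for t in tokens])     # shows how words split into sub-word pieces
```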
@GilCohen-z5t · 4 months ago
It would be great to get the slides :)
@LeonPetrou · 4 months ago
Sure, why would you like the slides?
@GilCohen-z5t · 4 months ago
@LeonPetrou I'm giving a talk at my small data startup on GenAI. I was hoping to incorporate some of your fantastic work from this video into my presentation. I'll be sure to give you full credit and direct people to your video.
@LeonPetrou · 4 months ago
@GilCohen-z5t Sure, happy to share the slides with you. I'd appreciate the traffic to the video. What email would you like me to send the slides to?
@prathamsinghjamwal4725 · a month ago
Sir, could you also provide the PDF of this video?
@kamal9991999 · 7 months ago
This video is a much better one ☝️
@LeonPetrou · 7 months ago
Appreciate that!
@rhktech · 7 months ago
Very well explained (Y)
@changliu7553 · 4 months ago
After watching a bunch of videos, I think yours clarifies many things! Thank you. A question here: you lost me between "the fisherman caught the fish with the net" and "the cat is sleeping". Are they connected? If you are trying to translate to another language, I can understand. But why does the GPT say "The cat is..."? What was the input there? Thanks.
@LeonPetrou · 4 months ago
In that example, "The cat is" is the input/prompt.
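A toy sketch of what happens with that prompt: the model repeatedly predicts the next token and appends it to the text. `predict_next_token` here is a hypothetical stand-in with canned answers, not a real model or API; it only shows the autoregressive loop.

```python
def predict_next_token(text):
    # Hypothetical stand-in: a trained model would return the most likely next token.
    canned = {"The cat is": " sleeping", "The cat is sleeping": "."}
    return canned.get(text, "")

prompt = "The cat is"
while True:
    nxt = predict_next_token(prompt)   # model predicts the next token
    if not nxt:
        break
    prompt += nxt                      # append it and feed the longer text back in

print(prompt)  # The cat is sleeping.
```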
@abooaw4588 · 9 months ago
Bravo 🇨🇵 It's a shame that this very good level of explanation is reserved only for those of us who understand English. LeCun and Bengio have a lot to do with it. Fortunately the "nutshell" isn't being translated by some half-baked GPT!
@LeonPetrou · 9 months ago
Merci beaucoup for your thoughtful comment! I'm glad you found the video informative. Your point about language accessibility is very important to us. We're actively exploring options to include subtitles in multiple languages in our future videos to ensure more viewers can benefit from our content.
@Bachanginh · 6 months ago
Cool man, I'm from Vietnam.
@najlaalhamdan7350 · 5 months ago
THE BEST!!!
@LeonPetrou · 5 months ago
Thank you!
@d96002 · 7 months ago
Not 175 trillion parameters but 1.75 trillion.
@LeonPetrou · 7 months ago
Thanks for clarifying, my bad.
@NavdeepVarshney-ep4ck · 8 months ago
Sir, are you a researcher or an ML enthusiast?
@LeonPetrou · 8 months ago
I'm an ML enthusiast with an engineering background. :)
@Keshi-lz3ef · 10 months ago
Great session!
@LeonPetrou · 10 months ago
Thank you!
@dragonwood-hc4sw · 7 months ago
Ed Stafford?
@LeonPetrou · 7 months ago
I see it! haha
@MaduraiKallan · 7 months ago
1.76 trillion for GPT-4.
@LeonPetrou · 7 months ago
indeed, thanks for clarifying!
@changliu7553 · 4 months ago
Did someone actually make the "attention weight" table? Say, for "Trump" and "Chair"? I think your video suggests someone might have done it.