In my opinion, the best explanation of positional encoding so far! Super clear and concise! Thank you very much, sir!
@cybermanaudiobooks3231 5 months ago
The best explanation of transformer positional encoding on the internet. Awesome video. Thanks!
@atabhatti2844 3 months ago
Great explanation. Short enough. Detailed enough. Enough talking. Enough showing. Loved the examples.
@Wesleykoonps 6 months ago
I like the very concise graphical explanation, with the analogy to binary coding and basic linear algebra!
@Adhbutham a month ago
Incredible! I have surfed through various resources online and no one got this so accurately. Absolutely spot on explanation.
@marcinstrzesak346 4 months ago
I couldn't find anywhere why the creators of the transformer decided to encode positions this way, and the last minute of your video was exactly what I was looking for. Thanks for the good explanation.
@markburton5318 a month ago
It seems the addendum is a fifth requirement. I can't word this precisely, but the positional encoding should be easily learnable: the encoding should be only a linear transformation of position. It cannot be an encryption of the token.
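For reference, the linear-transformation property this comment gestures at can be stated precisely for the sinusoidal encoding of "Attention Is All You Need"; a sketch (notation mine):

```latex
PE_{(pos,\,2i)} = \sin(\omega_i\, pos), \quad
PE_{(pos,\,2i+1)} = \cos(\omega_i\, pos), \quad
\omega_i = 10000^{-2i/d_{\mathrm{model}}}

\begin{pmatrix} \sin(\omega_i (pos+k)) \\ \cos(\omega_i (pos+k)) \end{pmatrix}
=
\begin{pmatrix} \cos(\omega_i k) & \sin(\omega_i k) \\ -\sin(\omega_i k) & \cos(\omega_i k) \end{pmatrix}
\begin{pmatrix} \sin(\omega_i\, pos) \\ \cos(\omega_i\, pos) \end{pmatrix}
```

The matrix depends only on the offset k, never on pos itself, which is what makes relative positions easy to learn with linear maps rather than looking like an encryption of the token.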
@mohammedelsiddig3939 3 months ago
I'm eternally grateful for this concise explanation, other sources made the positional encoding concept sound so counter-intuitive to grasp
@ea_777 4 months ago
Just when I was about to pull the last hair on top of my head, I came across this video. Beautifully explained. Thank you!
@JesseNerio 5 months ago
Fantastic. This was amazing! Best explanation.
@prabhuramnagarajan1893 a month ago
Please explain in detail the linear relation between two encodings. Your mathematical proofs sound excellent. Please recommend a good book to understand these concepts in detail.
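A minimal NumPy check of that linear relation (a sketch under the standard sinusoidal encoding; the function names here are mine):

```python
import numpy as np

def pe_pair(pos, omega):
    """The sin/cos pair for one frequency of the sinusoidal encoding."""
    return np.array([np.sin(omega * pos), np.cos(omega * pos)])

def offset_matrix(k, omega):
    """Linear map M(k) with PE(pos + k) = M(k) @ PE(pos); a rotation by omega*k."""
    c, s = np.cos(omega * k), np.sin(omega * k)
    return np.array([[c, s],
                     [-s, c]])

omega = 10000 ** (-2 * 3 / 512)   # frequency of dimension pair i=3 for d_model=512
pos, k = 17, 5
print(np.allclose(offset_matrix(k, omega) @ pe_pair(pos, omega),
                  pe_pair(pos + k, omega)))   # True for any pos: M(k) ignores pos
```

The check passes for every position, because the angle-addition identities make the offset matrix depend on k alone.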
@simaogoncalves1957 4 months ago
Keep these coming!
@temanangka3820 3 months ago
1. Why is the positional encoding added to the word embedding? Won't it change the semantic value? 2. Why does positional encoding use seemingly random numbers produced by sine and cosine? It seems it would be simpler to add one extra dimension to the word embedding storing the position as an integer. Why use such a hard, random-looking, unpredictable algorithm to encode positions?
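For what it's worth, the sine/cosine values are not random; they are a fixed, deterministic function of position and dimension. A minimal NumPy sketch of the standard construction:

```python
import numpy as np

def sinusoidal_pe(max_len, d_model):
    """Deterministic sinusoidal positional encodings (no randomness involved),
    following the formula from 'Attention Is All You Need'."""
    pos = np.arange(max_len)[:, None]            # positions 0 .. max_len-1
    i = np.arange(0, d_model, 2)[None, :]        # even embedding dimensions
    angles = pos / (10000 ** (i / d_model))      # one frequency per dimension pair
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)                 # odd dimensions get cosine
    return pe

pe = sinusoidal_pe(max_len=50, d_model=8)
print(pe.shape, pe.min(), pe.max())              # values stay within [-1, 1]
```

Unlike a raw integer position, these values stay bounded in [-1, 1], so they never grow without limit and swamp the word embedding at large positions.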
@gart1609 14 days ago
Why do we need to alternate sine and cosine? It seems like either one on its own should do the job. The only reason I can see for alternating is that this way we can solve the positional encoding problem with a wavelength twice as short as sine or cosine alone would allow. Is that right? Are there other reasons?
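One other commonly cited reason: a fixed offset k should act as a position-independent linear map, and the angle-addition identities need both the sine and the cosine of the current position for that:

```latex
\sin(\omega(pos+k)) = \sin(\omega\, pos)\cos(\omega k) + \cos(\omega\, pos)\sin(\omega k)

\cos(\omega(pos+k)) = \cos(\omega\, pos)\cos(\omega k) - \sin(\omega\, pos)\sin(\omega k)
```

With sine alone, \cos(\omega\, pos) is unavailable, so the shifted value cannot be written as a fixed linear combination of the unshifted one.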
@phobosmoon4643 a month ago
bravo
@wilfredomartel7781 a month ago
❤
@temanangka3820 3 months ago
How can adding the positional encoding to the word embedding not change the word's semantic meaning? Example: the word embedding of "Cat" is [1, 2, 3] and the word embedding of "Money" is [2, 3, 4]. If the positional encoding for "Cat" is [2, 1, 0] and the positional encoding for "Money" is [1, 0, -1], then the positionally encoded vector of both words is [3, 3, 3]. How can "Cat" equal "Money"?
@BrainDrain9000 3 months ago
Because the positional part is a constant. The token part is stochastic, it changes depending on the current token, but the positional part remains the same. Imagine that you recorded all the embeddings of the 0th token from the whole dataset and you got a map, a distribution. If you add some constant, this map remains the same, just shifted to another location. And yes, it will not work for two examples; you need a sufficient amount of data to prevent confusion.
@temanangka3820 2 months ago
@BrainDrain9000 I see... 🔥 Thank you ✅
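A toy NumPy illustration of @BrainDrain9000's point (the embeddings and positional vectors here are made up, not from any real model): in higher dimensions, an exact collision like the [3, 3, 3] example essentially never happens.

```python
import numpy as np

rng = np.random.default_rng(0)
d, vocab = 64, 200                       # toy sizes, not real model values

emb = rng.normal(size=(vocab, d))        # hypothetical token embedding table
pe0 = rng.normal(size=d)                 # stand-in positional vector, position 0
pe1 = rng.normal(size=d)                 # stand-in positional vector, position 1

a = emb + pe0                            # every token as seen at position 0
b = emb + pe1                            # every token as seen at position 1

# A "Cat" == "Money" style collision would drive this minimum to zero.
dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
print(dists.min())                       # comfortably above zero in 64 dims
```

Adding the same vector to every token shifts the whole cloud rigidly, preserving all pairwise distances within a position, which is the "map remains the same, just shifted" intuition above.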
@ronin2963 a month ago
Can you project your voice? Your ASMR tone is disturbing.