Positional encoding is a technique used in transformers to inject information about the position of each token in a sequence. Since transformers lack inherent awareness of sequence order, positional encodings enable the model to capture the order of words, which is crucial for understanding context. These encodings are added to the input embeddings, allowing the model to learn relationships that depend on token positions.
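For a concrete picture of the idea, here is a minimal Python sketch of the sinusoidal scheme from "Attention Is All You Need" covered in the video (even dimensions use sine, odd dimensions use cosine). The sequence length, model width, and random placeholder embeddings are illustrative assumptions, not values from the video.

import numpy as np

def positional_encoding(seq_len, d_model):
    # Each position gets a d_model-wide vector; the frequency falls as the
    # dimension index grows, so different dimensions vary at different rates.
    positions = np.arange(seq_len)[:, np.newaxis]         # shape (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]              # shape (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                      # shape (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])                 # even dims: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])                 # odd dims: cosine
    return pe

# The encoding is simply added element-wise to the token embeddings:
seq_len, d_model = 10, 512                                # assumed sizes
embeddings = np.random.randn(seq_len, d_model)            # placeholder embeddings
model_input = embeddings + positional_encoding(seq_len, d_model)

Because each dimension pair is a sine/cosine at a fixed frequency, the encoding of position pos + k can be written as a linear transformation of the encoding of pos, which is the "linear relationships" property the linked blog post explores.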
Blog link - blog.timodenk.com/linear-rela...
============================
Did you like my teaching style?
Check out my affordable mentorship program at: learnwith.campusx.in
DSMP FAQ: docs.google.com/document/d/1O...
============================
📱 Grow with us:
CampusX on LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
✨ Hashtags ✨
#datascience #positionalencoding #campusx #deeplearning
⌚Time Stamps⌚
00:00 - Intro
01:18 - Why is positional encoding required?
07:58 - Proposing a simple solution
22:17 - The sine function as a solution
34:26 - Explaining positional encoding
55:25 - Interesting observations
01:04:06 - Mind-blowing solution
01:10:13 - Blog for mathematical intuition
01:12:45 - Outro