Self-attention mechanism explained | Self-attention explained | scaled dot product attention

  878 views

Unfold Data Science

a month ago

Self-attention mechanism explained | Self-attention explained | self-attention in deep learning
#ai #datascience #machinelearning
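
Since the video's topic is scaled dot-product attention, here is a minimal NumPy sketch of the formula softmax(QK^T / sqrt(d_k))·V. The toy input X and the random projection matrices W_q, W_k, W_v are illustrative assumptions, not code from the video:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]                                 # query/key dimension
    scores = Q @ K.T / np.sqrt(d_k)                   # similarity of every query with every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax -> attention weights
    return weights @ V                                # weighted sum of value vectors

# Toy example: a "sentence" of 3 tokens with 4-dimensional embeddings.
# In self-attention, Q, K and V are learned projections of the same input X.
np.random.seed(0)
X = np.random.randn(3, 4)
W_q, W_k, W_v = (np.random.randn(4, 4) for _ in range(3))
out = scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v)
print(out.shape)   # (3, 4): one context-aware vector per token
```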
Hello,
My name is Aman and I am a Data Scientist.
All my data science courses at the most affordable price are here: www.unfolddatascience.com
Book a one-on-one session here (note: these sessions are chargeable): docs.google.com/forms/d/1Wgle...
Follow on Instagram: unfold_data_science
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples. Anyone without prior knowledge of programming, statistics, machine learning, or artificial intelligence can gain a high-level understanding of data science here. The videos are not highly technical and can be easily grasped by viewers from different backgrounds.
Book recommendation for Data Science:
Category 1 - Must Read For Every Data Scientist:
The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
Python Data Science Handbook - amzn.to/31UCScm
Business Statistics By Ken Black - amzn.to/2LObAA5
Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
Category 2 - Overall Data Science:
The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
Category 3 - Statistics and Mathematics:
Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
Practical Statistics for Data Scientist By Peter Bruce - amzn.to/37wL9Y5
Category 4 - Machine Learning:
Introduction to Machine Learning by Andreas C. Müller - amzn.to/3oZ3X7T
The Hundred-Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
Category 5 - Programming:
The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
Clean Code by Robert C. Martin - amzn.to/3oYOdlt
My Studio Setup:
My Camera: amzn.to/3mwXI9I
My Mic: amzn.to/34phfD0
My Tripod: amzn.to/3r4HeJA
My Ring Light: amzn.to/3gZz00F
Join the Facebook group:
groups/41022...
Follow on medium: / amanrai77
Follow on quora: www.quora.com/profile/Aman-Ku...
Follow on Twitter: @unfoldds
Watch the Introduction to Data Science full playlist here: • Data Science In 15 Min...
Watch the Python for data science playlist here:
• Python Basics For Data...
Watch the statistics and mathematics playlist here:
• Measures of Central Te...
Watch End to End Implementation of a simple machine-learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging, and Boosting here:
• Introduction to Ensemb...
Build Career in Data Science Playlist:
• Channel updates - Unfo...
Artificial Neural Network and Deep Learning Playlist:
• Intuition behind neura...
Natural language Processing playlist:
• Natural Language Proce...
Understanding and building a recommendation system:
• Recommendation System ...
Access all my codes here:
drive.google.com/drive/folder...
Have a different question for me? Ask me here: docs.google.com/forms/d/1ccgl...
My Music: www.bensound.com/royalty-free...

Comments: 13
@jayeshsingh116 a month ago
Well explained, thank you for covering these topics.
@ajitkulkarni1702 a month ago
Best explanation of self-attention!!!
@dinu9670 28 days ago
You are a saviour man. Great explanation. Please keep doing these videos 🙏
@UnfoldDataScience 27 days ago
Thanks, will do!
@user-kt5qu2ne7l a month ago
Thank you for your clear explanation
@AnkitGupta-rj4yy a month ago
Thank you for providing this to us ❤ in an easy way.
@irfanhaider3021 16 days ago
Kindly make a video on the GRU layer as well.
@funwithtechnology6526 27 days ago
Thank you for the very clear explanation :). I have a small question here: in self-attention, is there a limit to the dimension of the final attention embedding space?
@dhirajpatil6776 a month ago
Please make a video explaining the transformer architecture.
@ajitkulkarni1702 a month ago
Please make videos on multi-head attention...
@manoj1bk a month ago
Can the self-attention mechanism be used (as an embedding layer) before an LSTM in the context of time series analysis?
@RakeshKumarSharma-nc3cj a month ago
awesome video
@UnfoldDataScience 29 days ago
Thanks Rakesh