What is Word2Vec? A Simple Explanation | Deep Learning Tutorial 41 (Tensorflow, Keras & Python)

173,895 views

codebasics

Comments: 128
@codebasics 2 years ago
Check out our premium machine learning course with 2 industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
@mehmetbakideniz 1 year ago
I started searching for word2vec videos after failing to understand it by following Andrew Ng's lessons. This is the single video that actually explains that the word embeddings are 'the side effects' of the training process, and this is how it finally clicked for me. Thank you very much!
@anubhavthakur2985 6 months ago
Then you didn't search YouTube enough.
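For anyone who wants to see the "side effect" idea from the comment above in practice, here is a minimal sketch, assuming the gensim library (not used in the video itself): the training task is predicting context words, and the word vectors are simply the input-layer weights the model learns along the way.

```python
# Minimal gensim sketch: the embeddings are learned weights, not a direct output.
from gensim.models import Word2Vec

# Tiny toy corpus, invented for illustration.
sentences = [
    ["king", "ordered", "his", "troops", "to", "march"],
    ["queen", "ordered", "her", "guards", "to", "stay"],
]

# Train a small skip-gram model (sg=1); the objective is context prediction,
# and the embedding matrix is a by-product of optimizing that objective.
model = Word2Vec(sentences, vector_size=10, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["king"])                      # the learned 10-d vector for "king"
print(model.wv.similarity("king", "queen"))  # cosine similarity of two vectors
```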
@ahmedgamberli2250 2 years ago
I like how you love your homeland and use it in all your examples. Greetings and love from Azerbaijan.
@sunilgundrai6464 2 years ago
As part of my NLP dissertation, I was looking for some real-world use cases with clear explanations. I found this super useful; thank you for the great demonstration with so many examples that are easy to understand. You rock with your teaching skills!!
@girmayohannis4659 7 months ago
Nice to meet you here. At which university are you studying for your PhD? Thanks.
@austinburcham4260 2 months ago
This was unbelievably helpful. This filled in the intuition gap behind word2vec that I was so desperately looking for.
@assafbotzer7952 1 year ago
So clear, so eloquent, and so concise. Your content is a gift to this world. Thank you for using your intelligence, diligence and teaching skills to make a positive mark.
@fidaeharchli4590 8 months ago
I confirm.
@raom2127 3 years ago
Dhaval Sir, you present complex subject matter in a simplified way. You are a patient, consistent, organised expert at presenting a subject: basics, theory, coding, practice, all with great explanation.
@sanjeebkumargouda1471 3 years ago
Great explanation 🙌🙌🙌 After watching many videos on this topic, my understanding is finally crystal clear. You are doing an awesome job, sir.
@codebasics 3 years ago
I appreciate you leaving a comment of appreciation
@mimansamaheshwari4664 2 years ago
One of the best videos on word2vec
@gayathrigirishnair7405 2 years ago
This is the video that finally helped me grasp this concept. Thank You!
@robertcormia7970 9 months ago
This was a useful introduction. I don't have the math chops to understand it fully, but it was useful to hear some of these definitions.
@sumit121285 3 years ago
You are a real teacher... what can I say? Thank you, sir... thank you so much.
@anonymous-or9pw 10 months ago
He played it really well when he marked male = -1
@yonahcitron226 2 years ago
Incredible content. This guy is one of the best on YouTube.
@codebasics 2 years ago
I appreciate you leaving a comment of appreciation
@Sunilgayakawad 1 year ago
Crystal clear explanation!! Thank you so much, sir.
@vgreddysaragada 1 year ago
Super explanation. Thank you so much.
@SoftRelaxingAndCalmMusicNature 1 year ago
Well done. This is one of the best courses on word2vec so far. I have a master's degree in AI, and even though I never worked professionally in the field, your course brought back a lot of memories, haha. During my master's 15 years ago, I introduced an archaic method for question answering based on link grammar, WordNet, VerbNet and SemNet. At the end of my syntactic analysis, I also discovered that just by using word context it was possible to come up with a vector representation of named entities. The innovation here is the use of a neural network to give a value to the word. This is just brilliant. In my thesis I was already showing that language is just a code representing a subjective version of one universe, and that humans and animals communicate using their own codes.
@PremKumar136 2 years ago
Awesome explanation. Crystal clear.
@kanisrini01 5 months ago
Amazing video 👏🌟. Thank you so much for the great explanation.
@619vijay 4 months ago
Very useful and informative
@farnoushnazary7295 4 months ago
Awesome explanation. Thanks!
@overthrowenjoyer 2 months ago
What a great video and what a great explanation! ❤
@moni1122331 2 years ago
great teacher, great explanation, great presentation, great context
@fidaeharchli4590 8 months ago
Thank you.
@tagoreji2143 2 years ago
Good explanation, sir. Thank you.
@kimdaeeun6683 1 year ago
Easy explanation!! Thanks much 👍👍
@ledinhanhtan 9 months ago
Mind-blowing 🤯🤯 Thank you!
@javierlopezcampoy5951 1 year ago
Great explanation! Thank you very much
@notknown9307 3 years ago
Thanks, we are learning a lot from you.
@codebasics 3 years ago
Glad it was helpful!
@notknown9307 3 years ago
@@codebasics Waiting for your next upload; you are doing your work very well 👍👍
@dhirajkumarsahu999 3 years ago
Great visual way of teaching! Thank you so much, sir ❤️
@shivav7379 2 years ago
A very good explanation - really very helpful.
@moutazalreesh1763 1 day ago
Thank you so much for this amazing explanation! I was wondering if it's possible to download the PowerPoint slides used in this course. Could you please let me know where I can access them?
@abir95571 3 years ago
There's a subtle mistake in your CBOW explanation at 8:34. In CBOW the target is always the central word, and the features are the context, i.e. the surrounding words. That means for the substring "Emperor ordered his" and a window size of 3, the target is "ordered" and the features are "Emperor" and "his".
@ashwinshetgaonkar6329 2 years ago
So he explained skip-gram?
@abir95571 2 years ago
@@ashwinshetgaonkar6329 Yes.
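To make the distinction in this thread concrete, here is a small sketch in plain Python (sentence and window size invented for illustration) that builds training pairs both ways: CBOW maps the surrounding context to the center word, while skip-gram maps the center word to each surrounding word.

```python
# Hypothetical illustration: (features, target) pairs for a +/-1 window.
sentence = ["Emperor", "ordered", "his", "troops"]
window = 1

cbow_pairs, skipgram_pairs = [], []
for i, center in enumerate(sentence):
    # Words within the window around position i, excluding the center itself.
    context = [sentence[j]
               for j in range(max(0, i - window), min(len(sentence), i + window + 1))
               if j != i]
    cbow_pairs.append((context, center))                 # context -> center word
    skipgram_pairs.extend((center, c) for c in context)  # center -> each context word

print(cbow_pairs[1])       # (['Emperor', 'his'], 'ordered'): CBOW target is the center
print(skipgram_pairs[:3])  # [('Emperor', 'ordered'), ('ordered', 'Emperor'), ('ordered', 'his')]
```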
@BARaaz04 3 years ago
Very good explanation. Thanks.
@namansethi1767 3 years ago
Thank you, sir, for this playlist.
@neerajashish7042 3 years ago
Approximately how many more videos are going to come in this series beyond the existing ones? By the way, thanks a lot, sir; this is the most knowledgeable playlist on YouTube for machine learning and deep learning.
@codebasics 3 years ago
There will be at least 5 to 10 videos coming up, and then I will start the project series.
@ashwinivalmiki7636 3 years ago
Hello sir, please make a video on GRE and IELTS preparation. It would be very useful and helpful to students like me who are planning to study for a Masters abroad, as your videos are clear and we get motivated. Thank you.
@vishaldas6346 3 years ago
Also, what would be your next topic in deep learning? Is it sequence-to-sequence models?
@ShahabShokouhi 9 months ago
I was watching Andrew Ng's course on sequence models and his lecture on word2vec is just bullshit. Thank God I found your video; amazing explanation.
@ashokkonatham8857 3 years ago
Wow, very, very clear. Thank you 🙏
@codebasics 3 years ago
Glad it was helpful!
@vinaykumardaivajna5260 1 year ago
Great explanation as always
@list10001 2 years ago
Thank you! The explanation was very clear.
@shubhamwaingade4144 2 years ago
Awesome explanation of the concept!
@pradeept328 10 months ago
Great explanation
@vishaldas6346 3 years ago
I think, Dhaval, there is no non-linear activation function between the input layer and the hidden layer. Correct me if I am wrong.
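That matches the original word2vec design: the hidden layer is a purely linear projection (an embedding lookup, averaged over the context in CBOW), and only the output layer applies a softmax. A minimal Keras sketch of that shape, with vocabulary size, embedding size, and context length invented for illustration:

```python
# CBOW-shaped Keras model: note there is no activation between input and hidden.
import tensorflow as tf

vocab_size, embed_dim, context_len = 5000, 100, 4  # illustrative sizes

context = tf.keras.Input(shape=(context_len,), dtype="int32")  # context word ids
x = tf.keras.layers.Embedding(vocab_size, embed_dim)(context)  # lookup, purely linear
x = tf.keras.layers.GlobalAveragePooling1D()(x)                # average context vectors, still linear
output = tf.keras.layers.Dense(vocab_size, activation="softmax")(x)  # softmax only at the output

model = tf.keras.Model(context, output)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```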
@darshangangurde7855 3 years ago
Thanks a lot... really great. Please complete the playlist ASAP.
@manikant1990 3 years ago
Superbly explained!!
@vikaspatildod 2 years ago
Beautiful video
@minruili4789 3 years ago
Fantastic explanation!
@pretomghosh6231 4 months ago
This CBOW and skip-gram setup is a kind of encoding+decoding architecture like the autoencoders, if I am not wrong?
@notknown9307 3 years ago
Excited 😄
@madhu1987ful 3 years ago
Awesome, man... loved it... can you please upload a code walkthrough of this concept, with some good projects?
@harshvardhanagrawal 4 months ago
Where do we get the predicted output from? How do we enter it for comparison?
@phil97n 11 months ago
Awesome, thank you.
@taufiqulhaque4987 3 years ago
Would you please create a playlist on NLP?
@lohitsalavadhi6912 3 years ago
Finally, a great explanation.
@codebasics 3 years ago
🙏🙏
@trendyjewellery1987 6 months ago
Superb
@wenzhang5879 2 years ago
I think you mean 'side products' rather than 'side effect'?
@BeradinhoYilmaz 1 year ago
Is there a standard, real list of feature values for every object given here? For example, for cats, tail = 0.2?
@wp1300 11 months ago
7:20 The meaning of a word can be inferred from its surrounding words.
@bibhupadhy4155 2 years ago
Great explanation :) Crisp and to the point; better than the explanation in a Hrithik Roshan superhero movie :P :P
@ChaitraC9191 2 years ago
Hello, I have a doubt about this explanation: aren't all the weights going to be the same once our neural network is trained? What I mean is, once we train a network, W^T X is what triggers an output node, so how do we have different weights for every output word?
@cherupawan3777 1 year ago
Did you get an answer to this?
@djelloulbouchiha-cunaamaal7848 2 years ago
We need a course on NLP Transformers.
@thurakyawnyein6113 7 months ago
Superb.
@Cooldude5786 1 year ago
The statement "King - man + woman = Queen" is well known in machine learning. However, when we examine the characteristics of a king, they often include being super rich, having authority, and possibly not having a tail. Yet there is a contradiction: a lion is also referred to as a king, and it does have a tail. How can a computer differentiate between a human king and an animal king? Doesn't this introduce bias, since the training corpus typically associates "king" with humans rather than animals? Just because something appears less frequently or is absent from the corpus doesn't mean it lacks value or significance.
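For what it's worth, the analogy is computed purely from corpus co-occurrence statistics, so the answer reflects whichever sense of "king" dominates the training data. A sketch assuming gensim and its downloader ("word2vec-google-news-300" is one commonly distributed pretrained set, not something from the video):

```python
# Sketch: nearest-neighbor lookup for king - man + woman on pretrained vectors.
import gensim.downloader as api

wv = api.load("word2vec-google-news-300")  # large download on first use

# The result is whichever vocabulary word lands nearest to the combined vector,
# given how "king" was actually used in the training corpus (mostly the human sense).
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```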
@lisali6205 2 years ago
You are the best.
@imanqoly 1 year ago
The deeper you dig into a thing, the greater this tutor gets.
@amanagrawal4198 4 months ago
I think there's a mistake, because in both CBOW and skip-gram the weights that make the embeddings are always between the input and hidden layer, and here in CBOW you mentioned that the weights between the hidden and output layers are considered.
@amanagrawal4198 4 months ago
CBOW Model Architecture Review

The CBOW architecture works by predicting a target word based on the context words around it. Step by step:

1. Input layer: several one-hot encoded vectors corresponding to the context words.
2. Projection (hidden) layer: each one-hot vector retrieves a word embedding from the first weight matrix (input-to-hidden weights). Unlike typical neural networks, there is no activation function here; the embeddings are simply summed or averaged (depending on the implementation) to produce a single dense vector representing the combined semantic content of the context words.
3. Output layer: the averaged embedding vector is projected to the output layer using a second set of weights (hidden-to-output weights). The output layer is a softmax layer that predicts a probability distribution over the vocabulary for the target word.

Role of the weights in embedding formation:

- Input-to-hidden weights: this is essentially the embedding matrix. Each row corresponds to the embedding of a word in the vocabulary. When context words are fed into the model, their embeddings are retrieved by indexing this matrix with the one-hot vectors. These embeddings are what you typically extract and use as pre-trained embeddings for other tasks.
- Hidden-to-output weights: these transform the combined embedding from the hidden layer into a prediction for the target word. Each column of this matrix (since it is typically the transpose of the embedding matrix in many implementations) can be seen as a "contextual embedding" of a word when it acts as a target.
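A bare numpy sketch of those two matrices (all sizes and numbers invented for illustration), showing why the input-to-hidden weights end up serving as the embedding table:

```python
# Toy CBOW forward pass: vocab of 6 words, 3-d embeddings.
import numpy as np

V, D = 6, 3
W_in = np.random.randn(V, D)   # input-to-hidden: row v is the embedding of word v
W_out = np.random.randn(D, V)  # hidden-to-output: column v scores word v as target

context_ids = [1, 4]                 # one-hot inputs reduce to row indexing
h = W_in[context_ids].mean(axis=0)   # averaged context embedding, no activation

logits = h @ W_out
probs = np.exp(logits) / np.exp(logits).sum()  # softmax over the vocabulary
print(probs.argmax())                          # predicted target word id

# After training, W_in (sometimes combined with W_out) is what gets
# exported as the word vectors.
```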
@yasaswinigollapally7603 2 years ago
Sir, your video is awesome 🙌. I have one doubt: what is the main difference between the skip-gram and bag-of-words models?
@prasanth123cet 2 years ago
Will we get nearly identical word vectors from the CBOW and skip-gram methods for a particular word, say "king"?
@thamizharasim5970 2 years ago
Thanks a lot 😌
@rahulsoni412 3 years ago
Thanks a lot for explaining this using a neural network diagram :)
@codebasics 3 years ago
🙂👍
@rahulsoni412 3 years ago
@@codebasics Can you explain how the total number of weights is calculated in word embedding? I was getting confused while calculating it.
@prasannan-robots 3 years ago
Thanks for this awesome tutorial; waiting for the coding part :)
@anpowersoftpowersoft 5 months ago
Amazing
@uwaisahamedimad556 2 years ago
Hi, this is the most wonderful explanation of word2vec I've ever seen. I have a question: I have my own corpus and I have built multiple word2vec models. How do I evaluate these models, and how am I going to choose the best one?
@codebasics 2 years ago
One approach is to take a classification or some other NLP problem in your domain and build an NLP classification model using your embeddings. You can then check the performance of those models to evaluate how effective the embeddings are.
@uwaisahamedimad556 2 years ago
@@codebasics Thanks a lot for the reply. Based on your answer, it seems there is no standard, or at least no well-established, evaluation method for the performance of word embeddings.
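One common intrinsic benchmark does exist alongside the extrinsic, downstream-task check suggested above: accuracy on the classic analogy question set, which gensim ships for testing. A sketch, where "my_word2vec.kv" is a hypothetical path to one of your saved models:

```python
# Sketch: intrinsic evaluation of saved word vectors on analogy questions.
from gensim.models import KeyedVectors
from gensim.test.utils import datapath

wv = KeyedVectors.load("my_word2vec.kv")  # hypothetical saved KeyedVectors file

# Accuracy on "king is to queen as man is to ..." style questions;
# compare this score across your candidate models.
score, sections = wv.evaluate_word_analogies(datapath("questions-words.txt"))
print(f"analogy accuracy: {score:.3f}")
```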
@jongcheulkim7284 3 years ago
Thank you.
@lemoniall6553 2 years ago
Does word2vec use dimensionality reduction too?
@anandakhan4160 2 years ago
Sir, how do you unzip the JSON file using Git Bash? It is not clear to me. Please help. Thanks.
@arjunbali2079 2 years ago
Thanks, sir.
@kmnm9463 3 years ago
Hi Dhaval, great video on W2V. Could you share the link for the coding part of implementing Word2Vec in Python, please?
@codebasics 3 years ago
Yes, that video is coming up soon. I have not uploaded it yet.
@bii710 2 years ago
That was a great explanation, thanks. I have one question in mind: if all the words in the documents are unique, then how will word2vec find vectors for the last two words, considering CBOW?
@vitocorleone1991 2 years ago
Brilliant
@akshansh_00 6 months ago
Bam! Lifesaver.
@umerfarooque6373 8 months ago
How do you evaluate a word2vec model?
@ibrahemnasser2744 2 years ago
What would a mathematician do when they hear you say "a vector is nothing but a set of numbers"?
@sebinsaji9573 3 years ago
Can you talk about the scope and skills of cybersecurity?
@amanbajaj7591 1 year ago
Where is the neural network link?
@houchj0372 3 years ago
Doesn't CBOW mean Contextual Bag of Words?
@codebasics 3 years ago
Continuous Bag Of Words: analyticsindiamag.com/the-continuous-bag-of-words-cbow-model-in-nlp-hands-on-implementation-with-codes/
@houchj0372 3 years ago
@@codebasics you are correct, thank you. By the way, this video is excellent.
@PavanKumar-bk1sz 3 years ago
Can I get admission to a BSc in data science after 12th commerce at St Xavier's College Mumbai, with mathematics as my optional subject? Please tell me; I've been requesting this for 6 months 🙏
@humanardaki7911 2 years ago
working?
@johnnysaikia2439 11 months ago
The king of the jungle has a tail, though.
@amitmishra5474 1 year ago
The lion king has a tail 😅
@mmenjic 3 years ago
3:56 Why do horse and woman have the same gender value to start with? Then king minus man is gender -2; adding a woman or horse to that, you get gender -1, which is man or king!?
@priyeshsrivastava8025 2 years ago
No, it's (-1) - (-1) + (+1) = +1, i.e. queen.
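The same arithmetic with hand-made feature vectors in the spirit of the video (all numbers purely illustrative):

```python
# Toy features: [gender, royalty, tail]; man = -1, woman = +1 on the gender axis.
import numpy as np

words = {
    "king":  np.array([-1.0, 1.0, 0.0]),
    "man":   np.array([-1.0, 0.0, 0.0]),
    "woman": np.array([ 1.0, 0.0, 0.0]),
    "queen": np.array([ 1.0, 1.0, 0.0]),
    "horse": np.array([ 0.0, 0.0, 1.0]),
}

result = words["king"] - words["man"] + words["woman"]
# Gender axis: (-1) - (-1) + (+1) = +1, royalty stays 1, so "queen" is nearest.
closest = min(words, key=lambda w: np.linalg.norm(words[w] - result))
print(result, closest)  # [1. 1. 0.] queen
```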
@pawansinha8442 1 year ago
But in the case of the king of the jungle, i.e. the lion, he has a tail 😃 just saying...
@RePuLseHQKing 3 years ago
3:35 pay gap lmao
@santoshsaklani5019 2 years ago
Kindly make a video on vulnerability prediction using word2vec.
@mubashiraqeel9332 9 months ago
The thing is, all your videos are connected to previous ones, so I can never watch one straight through; you always make me pause and watch an earlier video. First I was watching the text classification video and you said to watch the BERT video first; then in that video you said to watch word2vec; then you said to watch part 1 first; and now in this video you say to watch the neural network video. Do you really want me to watch a whole video? I'm just opening new tabs repeatedly.
@danielbrockerttravel 3 months ago
This headline is a complete lie; that is not how embeddings work, and it's highly unethical.
@greenweed3253 3 months ago
Can you recommend other sources, then?
@danielbrockerttravel 3 months ago
@@greenweed3253 StatQuest teaches it more responsibly. He specifically told me he avoided the "king - man + woman = queen" example because it simply isn't how word vectors function in practice.
@shivangiawasthi9388 1 year ago
Found a better explanation here: kzbin.info/www/bejne/gJ7Ik5SXpaaWgc0
@PavanTripathi-rj7bd 1 year ago
Great explanation!
@debatradas9268 3 years ago
Thank you so much.