Rasa Algorithm Whiteboard - Understanding Word Embeddings 3: GloVe

  Views: 9,953

Rasa

A day ago

Comments: 21
@faangsde
@faangsde 4 years ago
One of the best channels/videos regarding NLP. You strike the balance between video length and in-depth explanation of every topic. Also, the video is easy to follow and instructive.
@r5bc
@r5bc 4 years ago
I watched the masterclass playlist, and I watched all the videos in this one too. I want to thank you for the amazing work, the extremely big value you provide to us, and the light you shed on the topic. I can't wait to see the next video. Please keep up the good work 👍👍👍👍
@anupammitra
@anupammitra 3 years ago
Would like to thank Rasa for amazing work on explaining the concepts. Truly exceptional
@WahranRai
@WahranRai 2 years ago
10:01 I think it is -log(Cij) instead of +log(Cij) (as in a squared-error term)
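(For anyone checking the sign raised in this comment and in @SreeramAjay's below: in the published GloVe objective the log co-occurrence count is indeed subtracted inside the squared term, so the dot product plus biases is trained to predict log(Cij). A minimal NumPy sketch of the per-pair loss; the function and variable names here are illustrative, not from the video:)

```python
import numpy as np

def weight_fn(c, c_max=100.0, alpha=0.75):
    # GloVe weighting f(C_ij): down-weights rare pairs, capped at 1
    # for frequent ones (c_max=100, alpha=0.75 as in the GloVe paper).
    return np.minimum((c / c_max) ** alpha, 1.0)

def glove_pair_loss(w_i, w_j, b_i, b_j, c_ij):
    # Weighted squared error between (w_i . w_j + b_i + b_j) and
    # log(C_ij) -- note the MINUS sign in front of the log term.
    return weight_fn(c_ij) * (np.dot(w_i, w_j) + b_i + b_j - np.log(c_ij)) ** 2
```

With the minus sign, a pair whose dot product plus biases exactly equals log(Cij) contributes zero loss, which is what makes the objective a regression onto the log counts.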
@duongphanai7094
@duongphanai7094 A year ago
this video saved my life! Good job! Keep it up!
@firstnamelastname3106
@firstnamelastname3106 3 years ago
bruh, you somehow managed to explain it better than the dude from Stanford, thanks!
@amirhosseinramazani757
@amirhosseinramazani757 2 years ago
I really enjoyed it! Thanks!
@kajumilylys2617
@kajumilylys2617 3 years ago
Thank you for this series, it's been super helpful ❤️
@SreeramAjay
@SreeramAjay 4 years ago
At 9:56, should that be (minus) -log(cij)?
@masoudparpanchi505
@masoudparpanchi505 4 years ago
🤔🤔
@kajumilylys2617
@kajumilylys2617 3 years ago
I would like to know why the weights change every time we retrain the embedding layer!?
@AttiDavidson
@AttiDavidson 4 years ago
Thank you very much! Very good explanation.
@PieroSavastano
@PieroSavastano 4 years ago
Beautiful!
@kasraamanat5453
@kasraamanat5453 2 years ago
That was so good, thank you ❤
@abdulrahmanmohamed2824
@abdulrahmanmohamed2824 4 years ago
Great explanation! But is there another one like this for BERT?
@RasaHQ
@RasaHQ 4 years ago
A few videos later in the series are about transformers/attention which are the core components of BERT.
@anupammitra
@anupammitra 3 years ago
What publicly available word embeddings are there for use apart from GloVe?
@RasaHQ
@RasaHQ 3 years ago
Hi Anupam! There are a lot of different word embeddings available. Some of the most common ones are GloVe, fastText and word2vec. Contextual embeddings are also increasingly popular, including ELMo and BERT. (All of these are for English; embeddings are also available in other languages.)
@anupammitra
@anupammitra 3 years ago
@@RasaHQ Thank you for your response
@a7d2e5aff
@a7d2e5aff 8 months ago
Thanks!
@masoudparpanchi505
@masoudparpanchi505 4 years ago
Well explained