Rasa Algorithm Whiteboard - Understanding Word Embeddings 3: GloVe

9,945 views

Rasa

1 day ago

Comments: 21
@faangsde 4 years ago
One of the best channels/videos on NLP. You strike the right balance between video length and in-depth explanation of each topic. The video is also easy to follow and instructive.
@r5bc 4 years ago
I watched the masterclass playlist, and I've watched all the videos in this one too. I want to thank you for the amazing work, the tremendous value you provide to us, and the light you shed on the topic. I can't wait to see the next video. Please keep up the good work 👍👍👍👍
@anupammitra 3 years ago
I would like to thank Rasa for the amazing work explaining these concepts. Truly exceptional.
@WahranRai 2 years ago
10:01 I think it should be -log(Cij) instead of +log(Cij) (similar to the square root error).
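(For reference: in the GloVe paper the log count is indeed subtracted. Writing C_ij for the co-occurrence count as the commenters do, the weighted least-squares objective is

J = \sum_{i,j=1}^{V} f(C_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log C_{ij} \right)^2

where V is the vocabulary size and f is a weighting function that damps very rare and very frequent pairs.)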
@firstnamelastname3106 3 years ago
Bruh, you somehow managed to explain it better than the dude from Stanford, thanks!
@duongphanai7094 1 year ago
This video saved my life! Good job! Keep it up!
@SreeramAjay 4 years ago
At 9:56, should that be (minus) -log(Cij)?
@masoudparpanchi505 4 years ago
🤔🤔
@kajumilylys2617 3 years ago
Thank you for this series, it's been super helpful ❤️
@kajumilylys2617 3 years ago
I would like to know why the weights change every time we retrain the embedding layer!?
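(One plausible answer, sketched below with a toy numpy stand-in for an embedding layer: the weights are randomly re-initialized on every run, and the non-convex training objective then converges to a different solution each time.)

import numpy as np

# Toy stand-in for an embedding layer: a randomly initialized matrix
# (here a vocabulary of 100 words, 8 dimensions) that training would
# then adjust. Each retraining starts from a fresh random draw unless
# the seed is pinned, so the final weights differ from run to run.
emb_run1 = np.random.default_rng(seed=1).normal(scale=0.1, size=(100, 8))
emb_run2 = np.random.default_rng(seed=2).normal(scale=0.1, size=(100, 8))
print(np.allclose(emb_run1, emb_run2))  # False: different starting points

Two runs can also differ even when both are equally good: the loss only constrains relative geometry, so rotating the entire embedding space leaves every dot product between word vectors unchanged.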
@AttiDavidson 4 years ago
Thank you very much! Very good explanation.
@amirhosseinramazani757 2 years ago
I really enjoyed it! Thanks!
@abdulrahmanmohamed2824 4 years ago
Great explanation! But is there another one like this for BERT?
@RasaHQ 4 years ago
A few videos later in the series cover transformers/attention, which are the core components of BERT.
@anupammitra 3 years ago
What publicly available word embeddings are there for use, apart from GloVe?
@RasaHQ 3 years ago
Hi Anupam! There are a lot of different word embeddings available. Some of the most common ones are GloVe, fastText, and word2vec. Contextual embeddings are also increasingly popular, including ELMo and BERT. (All of these are for English; embeddings are also available in other languages.)
@anupammitra 3 years ago
@RasaHQ Thank you for your response.
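(For anyone who wants to try these out: a minimal sketch using gensim's downloader API. The model names below come from gensim's pretrained-model list and may change over time.)

import gensim.downloader as api

# Downloads (on first use) and loads pretrained vectors as KeyedVectors.
glove = api.load("glove-wiki-gigaword-100")             # 100-d GloVe
fasttext = api.load("fasttext-wiki-news-subwords-300")  # 300-d fastText
print(glove.most_similar("king", topn=3))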
@kasraamanat5453 2 years ago
That was so good, thank you ❤
@PieroSavastano 4 years ago
Beautiful!
@a7d2e5aff 8 months ago
Thanks!
@masoudparpanchi505 4 years ago
Well explained!