Embeddings - EXPLAINED!

9,772 views

CodeEmporium

A day ago

Comments: 24
@bakyt_yrysov 12 days ago
Love the format! The quizzes, visualizations, and summary are awesome. Thank you!
@contactdi8426 1 year ago
Thanks a lot, Ajay, for such amazing, informative content. Just STOP saying the awkward "QUIZZZ time", though; the whole focus/mood goes away.
@justinwhite2725 11 months ago
It's annoying, but it also gives a break between sections that would otherwise blur together.
@walterbaltzley4546 11 months ago
I agree - I find the particular tone and pitch he uses when saying that to be painful (it literally hurts my ears). The transition from learning to review is a good idea; the execution could be improved.
@mehdicharife2335 5 months ago
I don't think the issue is just that computers only understand numbers. Even numbers aren't directly understood by computers; they have to be represented as combinations of 1s and 0s. Perhaps the issue is more that neural networks and similar models can't directly work with non-numerical data, hence the need for a numerical representation before any training can take place. You make a great point about the higher computational cost of using natural or default numerical representations for items like words and images, which explains the need for an "embedded" representation.
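A minimal sketch of the representation-cost point above, assuming PyTorch (the vocabulary size, embedding dimension, and word index below are made up for illustration): a one-hot vector needs one dimension per vocabulary word and is almost entirely zeros, while a learned embedding packs the same lookup into a few hundred dense, trainable dimensions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB_SIZE = 50_000   # assumed vocabulary size, for illustration
EMBED_DIM = 300       # assumed embedding dimension

# Naive representation: one-hot vector, one dimension per word in the vocabulary.
word_id = torch.tensor(42)                       # hypothetical index of some word
one_hot = F.one_hot(word_id, num_classes=VOCAB_SIZE).float()
print(one_hot.shape)                             # torch.Size([50000]) - huge and sparse

# Embedded representation: a learned lookup table of dense vectors.
embedding = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
dense = embedding(word_id)
print(dense.shape)                               # torch.Size([300]) - compact and trainable
```

Either tensor is just numbers to the computer; the embedding simply encodes the word far more compactly, and in a form the model can adjust during training.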
@youngsci 11 months ago
I've very much enjoyed every video you've made so far. Your explanations have always been extraordinary, but please stop saying "Quiz Time" 😂😂
@markchen8893 6 months ago
Great video! Thank you so much! It makes things easier for someone who has just started learning ML.
@punk3900 10 months ago
Genius presentation! Thanks! Keep up the excellent work!
@yolemmein 8 months ago
Very useful and a great explanation! Thank you so much!
@EobardUchihaThawne 1 year ago
My method for teaching the model new vocabulary words is actually to further train the pretrained model using transformers.
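A minimal sketch of that approach, assuming the Hugging Face transformers library; the checkpoint name and the new words are hypothetical. Add the unknown words to the tokenizer, resize the model's embedding matrix so they get rows, then fine-tune so those rows land in sensible places.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed base model; any pretrained checkpoint with a trainable embedding layer would do.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical new vocabulary items the pretrained tokenizer does not know.
new_words = ["embeddingology", "quizzify"]
num_added = tokenizer.add_tokens(new_words)

# Grow the embedding matrix so the new token ids have rows; these start out as
# freshly initialized vectors and only become useful after further training.
model.resize_token_embeddings(len(tokenizer))
print(f"Added {num_added} tokens; embedding matrix now has {len(tokenizer)} rows.")
```

Without the fine-tuning step, the new embedding rows stay near their initial values, so the model would not actually "know" the new words yet.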
@justinwhite2725 11 months ago
5:06 Technically, both B and C are correct here. I guess I would say C is the primary answer and B is a nice (but necessary) side benefit.
@F30-Jet 2 months ago
You failed, sir! There is no "technically." The answer is C. Understand the question t...technically.
@sagardesai1253 1 year ago
Thanks for the video; you explain things at different difficulty levels, and that works. The quiz and stuff isn't working for me, though; it breaks the flow of the content.
@MannyBernabe 11 months ago
Fun! Thank you!
@justchary 9 months ago
This is very good. Thank you!
@slitihela1860 11 months ago
Can you prepare a video on Double Q-Learning Networks and Dueling Double Q-Learning Networks, please?
@sharjeel_mazhar 9 months ago
Can you please make a video that shows how to generate custom word embeddings on a custom dataset from scratch, without using anything pre-built? Say, the IMDb dataset? And then later load them to train a classification model?
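Not a full video, but a minimal sketch of that workflow, assuming PyTorch and a tiny made-up corpus standing in for IMDb: build a vocabulary from scratch, train an nn.Embedding jointly with a small classifier, and save the learned embedding table so it can be reloaded later.

```python
import torch
import torch.nn as nn

# Toy stand-in for a real corpus such as IMDb reviews (hypothetical data).
texts = ["great movie loved it", "terrible plot boring acting",
         "loved the acting", "boring and terrible"]
labels = torch.tensor([1, 0, 1, 0])  # 1 = positive, 0 = negative

# Build a vocabulary from scratch - no pre-built embeddings involved.
vocab = {"<pad>": 0}
for t in texts:
    for w in t.split():
        vocab.setdefault(w, len(vocab))

def encode(text, max_len=6):
    ids = [vocab[w] for w in text.split()][:max_len]
    return ids + [0] * (max_len - len(ids))          # pad to a fixed length

x = torch.tensor([encode(t) for t in texts])

class TinyClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=16):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.fc = nn.Linear(embed_dim, 2)

    def forward(self, ids):
        emb = self.embedding(ids)          # (batch, seq, embed_dim)
        pooled = emb.mean(dim=1)           # average the word vectors
        return self.fc(pooled)

model = TinyClassifier(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for _ in range(50):                        # short training loop on the toy data
    opt.zero_grad()
    loss = loss_fn(model(x), labels)
    loss.backward()
    opt.step()

# The learned embedding table can be saved and reloaded into another model later.
torch.save(model.embedding.state_dict(), "custom_embeddings.pt")
```

In practice you would replace the toy sentences with tokenized IMDb reviews and a proper train/validation split; the key point is that the embedding matrix is just another trainable layer, so no pre-built vectors are required.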
@katzenschildkroete 1 year ago
C, B, A
@CodeEmporium 1 year ago
Ding ding ding! I agree with your answers!
@x_avimimius3294 1 year ago
Hi, I have an idea for an AI-based podcast, where I want to make AI the central element of the show. Can you guide me on this?
@AnA-xx1vx 1 year ago
4:30 What was this?? 😂
@CodeEmporium 1 year ago
Why, it's everyone's favorite time: Quiiiiiz Timmmmmmee, of course!
@AnA-xx1vx 1 year ago
@@CodeEmporium Why, brother?
@sudarshanseshadri2144 4 months ago
C, B, A