OpenAI Embeddings Explained in 5 Minutes

14,465 views

Cooper Codes

A day ago

Comments: 15
@enzogireaud5244 A year ago
Can't believe your channel is this underrated, you're making really good content, keep it up!
@CooperCodes A year ago
I can't believe I missed this! Thank you so much for your support, it means a lot :)
@Anton_Sh. 9 months ago
What is the source for the cool 3D embeddings visualization at 1:04?
@munkuo5 10 months ago
What is the font you are using? Love it.
@Leo_Oxford A month ago
Loved the concise explanation! Thank you
@Pritex2121 10 months ago
What's the tool used for the diagrams?
@willyjauregui6541 5 months ago
Thanks for the explanation. When storing embeddings, how does the system determine which phrases or words are similar to each other? Does it assign weights according to some previous training or knowledge? Also, in the vector DB, do we keep the text associated with each embedding, or is it just the arrays? If it's just the arrays, does the system need to convert them back to text when retrieving the data?
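A minimal sketch of the usual pattern (assuming the official openai Python SDK, numpy, and a tiny in-memory list standing in for a real vector database): the notion of similarity comes from the pre-trained embedding model, not from the database, and the store keeps the original text right next to each vector, so retrieval never has to convert numbers back into text.

```python
# Rough sketch, not the video's code: the "database" stores the original text
# alongside its vector, and search is just vector math over the embeddings.
import numpy as np
from openai import OpenAI  # official openai Python SDK (reads OPENAI_API_KEY)

client = OpenAI()

def embed(text: str) -> np.ndarray:
    """Turn a piece of text into its embedding vector."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

# Tiny in-memory stand-in for a vector DB: each record keeps text + vector.
store = [{"text": t, "vector": embed(t)} for t in [
    "LeBron James plays basketball",
    "Serena Williams plays tennis",
    "Cooper Codes makes programming videos",
]]

def search(query: str, k: int = 2):
    """Rank stored records by cosine similarity to the query embedding."""
    q = embed(query)
    scored = [
        (float(np.dot(q, r["vector"]) /
               (np.linalg.norm(q) * np.linalg.norm(r["vector"]))), r["text"])
        for r in store
    ]
    return sorted(scored, reverse=True)[:k]

print(search("famous athletes"))  # the athlete sentences should score highest
```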
@itskittyme 6 months ago
2:50 I don't understand how the text is grouped. Who or what decides that they are grouped by the fact that they are athletes and not by their nationality? And why is everything that says "Cooper" grouped together, instead of being grouped with all YouTubers, humans, or programmers?
@ImNotActuallyChristian 4 months ago
It's a bit of a simplified model. What's not really shown here is that the text isn't embedded in 3 dimensions but in thousands of dimensions. Some concepts may be close to each other in some dimensions but far apart in others. Even that is a very simplified mental model.
@itskittyme 4 months ago
@ImNotActuallyChristian So are we searching in all dimensions at the same time?
@MageDigest-c1z 21 hours ago
It's the LLM's embedding technique that does all the magic in the background: it places each word/sentence/document in an n-dimensional vector space according to its semantic meaning.
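To make the "all dimensions at once" point concrete, here is a small illustrative sketch (assuming numpy; the vectors are random stand-ins for real embeddings): cosine similarity combines every dimension in a single operation and returns one score, so there is no per-dimension search.

```python
# Illustrative only: real embeddings have ~1,536+ dimensions, these are random.
import numpy as np

rng = np.random.default_rng(0)
dims = 1536                       # typical size for text-embedding-3-small
query = rng.normal(size=dims)     # pretend embedding of the search text
doc = rng.normal(size=dims)       # pretend embedding of a stored document

# Cosine similarity uses every dimension at the same time and yields one number
# between -1 and 1; higher means the two texts are semantically closer.
similarity = np.dot(query, doc) / (np.linalg.norm(query) * np.linalg.norm(doc))
print(f"similarity across all {dims} dimensions: {similarity:.4f}")
```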
@russelnormandia2876 A year ago
Thanks for the concise explanation
@Username23134 10 months ago
Nice video!
@parkerrex A year ago
Good explainer
@mindfulnessrock 2 months ago
I still don't quite get what those embeddings are. Are they some kind of base64 encoding of data to provide context for AI models, or just a reference to my data stored in the remote/local AI model's memory?
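A quick way to see what an embedding actually is (a sketch assuming the official openai Python SDK): the API returns a plain array of floats computed from your text, not a base64 blob and not a handle to data kept in the model's memory; storing the vector afterwards is up to you.

```python
# Sketch: fetch one embedding and inspect it. Requires OPENAI_API_KEY to be set.
from openai import OpenAI

client = OpenAI()
resp = client.embeddings.create(
    model="text-embedding-3-small",
    input="OpenAI embeddings explained in 5 minutes",
)

vector = resp.data[0].embedding   # just a Python list of floats
print(type(vector), len(vector))  # e.g. <class 'list'> 1536
print(vector[:5])                 # first few coordinates of the vector
```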