Mighty New TransformerFAM (Feedback Attention Mem)

1,822 views

Discover AI

1 day ago

Comments: 5
@kimchi_taco 8 months ago
I'm so grateful that you featured my paper! I was incredibly impressed by how thoroughly you explained the paper I uploaded to arXiv this week. It was especially meaningful to me as a fellow physics major. Thank you for your continued dedication to your work.
@code4AI 8 months ago
Of course it was a fellow physicist who developed such a beautiful theory! I really enjoyed the pre-print and am looking forward to your next ideas on transformers (I would appreciate a short note here on the channel so that I don't miss your next work). Congrats to you and the team.
@jflu1 8 months ago
Literally the best channel on all of YouTube. Love your content, a green grasshopper.
@AnshumanKumar007 8 months ago
This channel is a goldmine.
@MegaClockworkDoc 8 months ago
Wonderful work, professor ❤
NEW: INFINI Attention w/ 1 Mio Context Length · 21:23 · Discover AI · 2.1K views
Visualizing transformers and attention | Talk for TNG Big Tech Day '24 · 57:45
Why Does Diffusion Work Better than Auto-Regression? · 20:18 · Algorithmic Simplicity · 422K views
TransformerFAM: Feedback attention is working memory · 37:01 · Yannic Kilcher · 38K views
Transformers (how LLMs work) explained visually | DL5 · 27:14 · 3Blue1Brown · 4.3M views
GraphRAG: The Marriage of Knowledge Graphs and RAG: Emil Eifrem · 19:15
PydanticAI - Building a Research Agent · 17:34 · Sam Witteveen · 21K views