Graph Language Models EXPLAINED in 5 Minutes! [Author explanation 🔴 at ACL 2024]

  4,696 views

AI Coffee Break with Letitia

1 day ago

Comments: 11
@jmirodg7094 · a month ago
Excellent! This needs to be explored more deeply; it could be a game changer for reasoning, since it makes more sense to reason over a graph than over the next token.
@bensimonjoules4402 · a month ago
It's interesting to see "attention" applied to graph structures again. I think that in the future a more structured knowledge representation may play a role in improving reasoning, as we could leverage logic and rule engines on top of it, much like compilers aid code generation.
@bharanij6130 · a month ago
Thank you for this video, Letitia! Amazing as always. :) Side note: loved the silent goodbye. :)
@jonclement · a month ago
Interesting. It's almost like two types of tokens, nodes and edges, each of which can be compressed to a feature vector. But yes, for positional encoding you're left with "random walk with restart" or a traversal depth. Or one could sum node_vector + edge_vector ≈ positional distance. Either way, more graph solutions are coming in the future.
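The idea in this comment, treating both nodes and edges as tokens and deriving positions from graph structure, can be sketched as follows. This is my own toy illustration, not code from the paper: the triples, the `hop_distance` helper, and the undirected token graph are all assumptions made for the example; a real structural positional encoding (e.g. shortest-path relative positions) would feed such distances into the attention computation.

```python
from collections import defaultdict, deque

# Hypothetical toy knowledge graph: (head, relation, tail) triples.
triples = [
    ("dog", "is_a", "animal"),
    ("animal", "can", "move"),
]

# Treat every node AND every edge label as its own token.
adj = defaultdict(list)  # token-level adjacency: node <-> edge <-> node
for h, r, t in triples:
    adj[h].append(r)
    adj[r].append(t)
    # undirected at the token level, so distances are symmetric
    adj[r].append(h)
    adj[t].append(r)

def hop_distance(src, dst):
    """BFS over the token graph: a stand-in for a structural
    positional encoding based on shortest-path distance."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        cur, d = queue.popleft()
        if cur == dst:
            return d
        for nxt in adj[cur]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None  # unreachable: no finite relative position

# dog -is_a- animal -can- move: 4 hops at the token level
print(hop_distance("dog", "move"))  # → 4
```

The point of the sketch: once edges are tokens too, "position" is no longer a sequence index but a graph distance between token pairs.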
@vladimirtchuiev2218 · a month ago
I'm even more interested in the generative side: generating large graphs with text contained in them from a prompt, which could be useful for modalities represented by large graphs. I've yet to see anyone do this. While you can prompt LLMs to generate small graphs, for larger graphs you see significant performance drops.
@sonOfLiberty100 · a month ago
It would be interesting to know how much computation this needs.
@AICoffeeBreak · a month ago
Do you mean for training or inference? Training is a fine-tuning setting, and you can see performance curves in Figure 4 of the paper: arxiv.org/pdf/2401.07105. Inference costs as much as the base LLM.
@sonOfLiberty100 · a month ago
@@AICoffeeBreak Both, the overall computation. Thank you, I will take a look.
@MoritzPlenz · a month ago
Hi, I am Moritz (one of the authors). I don't have much to add to Letitia's reply, but here is another relevant part of the paper, taken from Section 4: "Being transformers, GLMs have the same computational complexity as their respective LM. For sparse graphs the lGLM could make use of sparse matrix multiplication, making it more efficient than a corresponding LM or gGLM. However, for our experiments this was not necessary."
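A back-of-the-envelope illustration of the sparsity point above (my own sketch, not the authors' code, and the ring graph is an invented example): with a sparse attention mask, attention scores are only computed for token pairs connected in the graph, so the work scales with the number of edges rather than with all N² pairs.

```python
# Toy comparison: dense attention vs. graph-sparse attention.
N = 100  # number of tokens (graph nodes)

# Assumed sparse graph: a ring, so each node has one outgoing edge.
edges = {(i, (i + 1) % N) for i in range(N)}

dense_ops = N * N        # dense attention: every pair is scored
sparse_ops = len(edges)  # sparse attention: only existing edges are scored

print(dense_ops, sparse_ops)  # → 10000 100
```

For a graph this sparse the masked computation touches 100 pairs instead of 10,000, which is why sparse matrix multiplication can pay off, though as the quote notes, it was not needed for the paper's experiments.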
@yorailevi6747 · a month ago
I need to read this more deeply; I don't understand why just grafting the parameters willy-nilly works.
@keeperofthelight9681 · a month ago
Deep learning is more alchemy than anything: a carefully thought-out plan may not work, and sometimes a hacky workaround performs much better.