31 - Singular Learning Theory with Daniel Murfet

404 views

AXRP

17 days ago

What's going on with deep learning? What sorts of models get learned, and what are the learning dynamics? Singular learning theory, a theory of Bayesian statistics broad enough in scope to encompass deep neural networks, may help answer these questions. In this episode, I speak with Daniel Murfet about this research program and what it tells us.
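(A bit of standard background for new listeners, summarized here as an aside rather than as a claim from the episode: in Watanabe's singular learning theory, the Bayesian free energy of a model with d parameters expands asymptotically as

  F_n = n L_n(w_0) + \lambda \log n + O_p(\log \log n),

where λ is the learning coefficient, also known as the real log canonical threshold. For regular models λ = d/2, but for singular models such as neural networks it can be much smaller, which is one way the theory quantifies effective model complexity.)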
Patreon: patreon.com/axrpodcast
Ko-fi: ko-fi.com/axrpodcast
Topics we discuss, and timestamps:
0:00:26 - What is singular learning theory?
0:16:00 - Phase transitions
0:35:12 - Estimating the local learning coefficient
0:44:37 - Singular learning theory and generalization
1:00:39 - Singular learning theory vs other deep learning theory
1:17:06 - How singular learning theory hit AI alignment
1:33:12 - Payoffs of singular learning theory for AI alignment
1:59:36 - Does singular learning theory advance AI capabilities?
2:13:02 - Open problems in singular learning theory for AI alignment
2:20:53 - What is the singular fluctuation?
2:25:33 - How geometry relates to information
2:30:13 - Following Daniel Murfet's work
The transcript: axrp.net/episode/2024/05/07/e...
Daniel Murfet's twitter/X account: / danielmurfet
Developmental interpretability website: devinterp.com
Developmental interpretability YouTube channel: / @devinterp
Main research discussed in this episode:
- Developmental Landscape of In-Context Learning: arxiv.org/abs/2402.02364
- Estimating the Local Learning Coefficient at Scale: arxiv.org/abs/2402.03698
- Simple versus Short: Higher-order degeneracy and error-correction: www.lesswrong.com/posts/nWRj6...
Other links:
- Algebraic Geometry and Statistical Learning Theory (the grey book): www.cambridge.org/core/books/...
- Mathematical Theory of Bayesian Statistics (the green book): www.routledge.com/Mathematica...
- In-context learning and induction heads: transformer-circuits.pub/2022...
- Saddle-to-Saddle Dynamics in Deep Linear Networks: Small Initialization Training, Symmetry, and Sparsity: arxiv.org/abs/2106.15933
- A mathematical theory of semantic development in deep neural networks: www.pnas.org/doi/abs/10.1073/...
- Consideration on the Learning Efficiency Of Multiple-Layered Neural Networks with Linear Units: papers.ssrn.com/sol3/papers.c...
- Neural Tangent Kernel: Convergence and Generalization in Neural Networks: arxiv.org/abs/1806.07572
- The Interpolating Information Criterion for Overparameterized Models: arxiv.org/abs/2307.07785
- Feature Learning in Infinite-Width Neural Networks: arxiv.org/abs/2011.14522
- A central AI alignment problem: capabilities generalization, and the sharp left turn: www.lesswrong.com/posts/GNhMP...
- Quantifying degeneracy in singular models via the learning coefficient: arxiv.org/abs/2308.12108

Comments: 2
@dizietz 10 days ago
This one was pretty technical for those of us who haven't read some of the foundational work on SLT. I had to stop and look up some specific details later, and I still don't feel like I fully grasp what makes SLT different from other accounts of degeneracy and the preference for simple functions when it comes to predicting NN behavior. Daniel's framing of fundamental structures in the data being what matters across training runs makes a lot of sense, but I still don't grok how this helps with alignment. I suppose understanding the stability of structure moves us closer, both on something like interpretability and also on capabilities.
@nowithinkyouknowyourewrong8675 13 days ago
10 mins in and I still don't get why it's interesting. It seems like a math/stats tool that he's still building, which will enable us to do other things.