The State Space Model Revolution, with Albert Gu

1,600 views

Cognitive Revolution "How AI Changes Everything"

1 day ago

Nathan hosts Albert Gu, assistant professor at CMU and co-founder of Cartesia AI, to discuss the groundbreaking Mamba architecture. In this episode of The Cognitive Revolution, we explore the state space model revolution, diving into the technical details of Mamba and Mamba 2. Join us for an insightful conversation on the future of AI architectures and their potential to transform the field.
Apply to join over 400 founders and execs in the Turpentine Network: hmplogxqz0y.typeform.com/to/J...
RECOMMENDED PODCAST:
Byrne Hobart, the writer of The Diff, is revered in Silicon Valley. You can get an hour with him each week. See for yourself how his thinking can upgrade yours.
Spotify: open.spotify.com/show/6rANlV5...
Apple: podcasts.apple.com/us/podcast...
SPONSORS:
Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at oracle.com/cognitive
The Brave Search API can be used to assemble a data set to train your AI models and to help with retrieval augmentation at inference time, all while remaining affordable with developer-first pricing. Integrating the Brave Search API into your workflow translates to more ethical data sourcing and more human-representative data sets. Try the Brave Search API for free for up to 2,000 queries per month at bit.ly/BraveTCR
Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off www.omneky.com/
Squad gives you access to global engineering without the headache and at a fraction of the cost: head to choosesquad.com/ and mention "Turpentine" to skip the waitlist.
CHAPTERS:
(00:00:00) About the Show
(00:05:39) State Space Models
(00:13:05) Intuition and inspiration
(00:18:27) Surprises
(00:22:33) Sponsors: Oracle | Brave
(00:24:41) Biological inspiration
(00:25:19) Mamba breakthrough
(00:30:59) How does the state work?
(00:36:44) What is the size of the state?
(00:39:05) Training vs. Inference (Part 1)
(00:42:04) Sponsors: Omneky | Squad
(00:43:51) Training vs. Inference (Part 2)
(00:43:51) Sequence Models
(00:49:20) Mamba inference
(00:57:53) Mamba 2 vs. Mamba 1
(01:16:05) Overtraining and the future of SSMs
(01:17:44) Training efficiency vs inference efficiency
(01:20:52) Hybrid models
(01:25:04) Scaling Attention Layers
(01:30:23) Optimizing State
(01:34:09) The extrapolation abilities of SSMs
(01:36:37) Sequence parallelism with Mamba 2
(01:39:20) Why are you publishing all this?
(01:40:46) Cartesia and Together
(01:41:54) Outro

Comments: 8
@wwkk4964 · 28 days ago
Albert is the man! Thank you!
@mkamp · 18 days ago
That's an awesome episode. Very high information density. I am constantly rewinding to hear the exact framing of the questions and answers again.
@augmentos · 26 days ago
That was a two-time watch.
@charliesteiner2334 · 28 days ago
This one was a good'un.
@CognitiveRevolutionPodcast · 28 days ago
Thanks. Mamba🐍😄!
@mkamp · 18 days ago
At 1:15 Albert says that doubling the state size in Mamba 1 doubles the wall-clock time, and also that in Mamba 2 much of the computation is not contingent on the state size. Why the latter? Is the computation time roughly constant because it's one matmul happening in parallel as one step on the GPU?
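For intuition, here is a minimal NumPy sketch of that distinction, assuming the scalar per-step decay of Mamba 2's SSD formulation and toy shapes (this is not the actual kernel): the sequential recurrence has T dependent steps, while the unrolled form is a single triangular matmul the GPU executes in parallel.

```python
# Toy sketch (assumption: scalar per-step decay a_t, as in Mamba 2's SSD view;
# not the official kernel). It only illustrates why the matmul form parallelizes.
import numpy as np

T, N = 8, 4                             # sequence length, state size
rng = np.random.default_rng(0)
a = rng.uniform(0.1, 0.9, size=T)       # input-dependent (selective) decay per step
Bx = rng.standard_normal((T, N))        # per-step input contribution B_t * x_t

# Mamba 1 style: sequential recurrence h_t = a_t * h_{t-1} + B_t x_t.
# T dependent steps; doubling N doubles the work of every serial step.
h = np.zeros(N)
seq_states = np.empty((T, N))
for t in range(T):
    h = a[t] * h + Bx[t]
    seq_states[t] = h

# Mamba 2 / SSD style: unrolling gives h_t = sum_{s<=t} (prod_{r=s+1..t} a_r) * B_s x_s.
# All cumulative decays fit in one lower-triangular T x T matrix, so every state
# falls out of a single matmul the GPU runs in parallel; a larger N just widens
# that matmul instead of adding serial steps.
c = np.cumsum(np.log(a))                      # c_t = sum of log a_r up to step t
L = np.tril(np.exp(c[:, None] - c[None, :]))  # L[t, s] = prod_{r=s+1..t} a_r
par_states = L @ Bx                           # (T, T) @ (T, N): one parallel matmul

assert np.allclose(seq_states, par_states)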
@mkamp · 18 days ago
When Albert says, multiple times, that they avoid materializing the state, it sounds as if they don't materialize the state at all during the forward pass in training. Does he mean exactly that? Or that they avoid materializing the full state at once, but materialize the whole state incrementally, chunk by chunk?
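One plausible reading, sketched under the same toy assumptions as above (scalar decay, a hypothetical chunk length C): the full (T, N) tensor of states is never held at once; each chunk's states are produced by matmuls, consumed, and discarded, and only a single boundary state is carried between chunks.

```python
# Toy sketch (same assumptions as above; not the paper's kernel): a chunked scan
# that never holds all T states at once. Within a chunk, states come from matmuls;
# across chunks, only one N-sized boundary state is carried.
import numpy as np

T, N, C = 8, 4, 4                       # sequence length, state size, chunk length (hypothetical)
rng = np.random.default_rng(0)
a = rng.uniform(0.1, 0.9, size=T)       # per-step scalar decay
Bx = rng.standard_normal((T, N))        # per-step input contribution B_t * x_t

# Sequential reference, for checking the chunked result.
h_ref, ref = np.zeros(N), np.empty((T, N))
for t in range(T):
    h_ref = a[t] * h_ref + Bx[t]
    ref[t] = h_ref

h = np.zeros(N)                         # the only state carried between chunks
chunks = []
for start in range(0, T, C):
    a_c, bx_c = a[start:start + C], Bx[start:start + C]
    c = np.cumsum(np.log(a_c))
    L = np.tril(np.exp(c[:, None] - c[None, :]))  # intra-chunk cumulative decays
    states = L @ bx_c + np.exp(c)[:, None] * h    # this chunk's states only
    chunks.append(states)               # a real kernel would consume these immediately
    h = states[-1]                      # materialize just the chunk-boundary state

assert np.allclose(np.concatenate(chunks), ref)
```

Under that reading, "avoiding materialization" would mean the all-timesteps state tensor never exists in memory: only per-chunk slices and one carried boundary state do.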