Can I Turn A 158 Million Parameter LLM Into A GPT01 Level Model?

29 views

Richard Aragon

1 day ago

Comments: 2
@manuelgrama3000 3 hours ago
Why not use something more useful, like at least a 0.5B model, or a 1B? A quantized 1B can still run on a phone without issues.
@richardaragon8471 3 hours ago
What is not useful about it? It works! I think it is amazing. I love these models because you can test anything with them. If you find something you like, then try it on a bigger model.
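
The workflow the reply describes, experiment on a tiny model first and only then move to a larger one, is easy to try with Hugging Face Transformers. The sketch below is a minimal example under that assumption, not the exact setup from the video; the model ID EleutherAI/pythia-160m is used only as a stand-in for a roughly 158M-parameter model.

```python
# Minimal sketch: run quick experiments on a ~160M-parameter model before
# scaling up. EleutherAI/pythia-160m is an illustrative small checkpoint,
# not necessarily the 158M model discussed in the video.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-160m"  # stand-in small model (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Question: What is 17 + 25? Answer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Anything that improves behaviour at this scale (prompting tricks, fine-tuning, decoding changes) can then be re-run on a 0.5B or 1B checkpoint, as the first comment suggests.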