Scaling PyTorch Model Training With Minimal Code Changes

3,893 views

Sebastian Raschka

A day ago

Comments: 13
@stanislawcronberg3271 · A year ago
Love the straightforward video, didn't know about Fabric for quickly upgrading existing PyTorch code
@user-wr4yl7tx3w · A year ago
Wow, great presentation.
@SebastianRaschka · A year ago
Thanks :)
@dinabandhub · A year ago
Great tutorial. 🎉
@SebastianRaschka · A year ago
Thank you! 😊
@user-wr4yl7tx3w · A year ago
This is really awesome content.
@nguyenhuuuc2311 · A year ago
Thanks so much for the tutorial. I learned a lot from you! I have a question: what modifications should be made to the fabric.setup(model, optimizer) call if I use a learning rate scheduler?
@nguyenhuuuc2311 · A year ago
And just some personal feedback on an awesome tutorial: it would be great if you could include a gentle reminder that running code on multiple GPUs often requires a script rather than executing cells directly in a notebook. Sorry if I missed this already being mentioned in the tutorial.
@SebastianRaschka · A year ago
Thanks, and great question. Since normal schedulers don't have any learnable parameters, you can use them as usual (no need to pass them to fabric.setup). But using fabric.setup also doesn't hurt. I added a quick example here: github.com/rasbt/cvpr2023/blob/main/07_fabric-vit-mixed-fsdp-with-scheduler.py
@SebastianRaschka · A year ago
@nguyenhuuuc2311 Good point. Yeah, notebook (or interactive) environments are generally incompatible with multi-GPU training due to their multiprocessing limitations. Hah, I take it for granted these days, but it's definitely a good thing to mention as a reminder!
@nguyenhuuuc2311 · A year ago
@SebastianRaschka Thanks for spending time on my question and for the quick answer with a notebook ❤
@hamzawi2752 · A year ago
Thank you so much. Impressive presentation! Do you think it is worth learning Lightning? I am a PhD student and I am comfortable with PyTorch. Does Lightning have all the capabilities of PyTorch? I know that Lightning is to PyTorch what Keras is to TensorFlow.
@SebastianRaschka · A year ago
Good question @hamzawi2752. Fabric (covered in this video) is essentially an add-on to PyTorch: it lets you tap into more advanced features like multi-GPU training, mixed-precision training, etc. with minimal code changes. It's a thin wrapper around PyTorch features you could implement yourself, but doing so in pure PyTorch is definitely more work. So, I'd say it's worth it.