Lightning Talk: Large-Scale Distributed Training with Dynamo and PyTorch/XLA SPMD - Yeounoh Chung & Jiewen Tan, Google

632 views

PyTorch

7 months ago

In this talk we cover the PyTorch/XLA distributed API in relation to TorchDynamo. Specifically, we discuss the new PyTorch/XLA SPMD API for automatic parallelization and our latest LLaMA2 training results. PyTorch/XLA SPMD makes it simple for PyTorch developers to distribute their ML workloads (e.g., training and inference with Dynamo) through an easy-to-use API, built on XLA GSPMD, a high-performance automatic parallelization system. Under the hood, it transforms the user's single-device program into a partitioned one. We will share how we enabled advanced 2D sharding strategies for LLaMA2 using PyTorch/XLA SPMD.
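As a rough illustration of the workflow the description outlines, here is a minimal sketch based on the PyTorch/XLA SPMD user guide: the program is written as if for a single device, and sharding annotations let GSPMD partition it across a 2D device mesh. Module paths (e.g., torch_xla.distributed.spmd) have moved between releases, and the mesh shape, axis names ('data', 'model'), and tensor sizes below are illustrative assumptions, not taken from the talk.

```python
import numpy as np
import torch
import torch_xla.core.xla_model as xm
import torch_xla.runtime as xr
import torch_xla.distributed.spmd as xs

# Enable SPMD mode so XLA GSPMD partitions the traced graph.
xr.use_spmd()

# Build a 2D logical mesh over all devices, e.g. 8 devices -> (4, 2),
# with one axis for data parallelism and one for model parallelism.
num_devices = xr.global_runtime_device_count()
mesh_shape = (num_devices // 2, 2)
mesh = xs.Mesh(np.arange(num_devices), mesh_shape, ('data', 'model'))

# Write the computation as a single-device program...
x = torch.randn(16, 128).to(xm.xla_device())  # activations
w = torch.randn(128, 256).to(xm.xla_device())  # weight

# ...then annotate shardings: activations sharded along 'data',
# the weight sharded along 'model'; None means replicated on that dim.
xs.mark_sharding(x, mesh, ('data', None))
xs.mark_sharding(w, mesh, (None, 'model'))

# Executes as a partitioned computation; GSPMD propagates shardings
# and inserts the necessary collectives automatically.
y = x @ w
```

The partition spec maps each tensor dimension to a mesh axis (or None for replication), which is how a single mesh can express the combined data-plus-model 2D sharding strategy the talk describes for LLaMA2.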

Comments: 1
@lilegend4382 5 days ago
👍👍👍