The Fundamentals of Autograd

60,500 views

PyTorch


Autograd is the automatic gradient-computation framework that works with PyTorch tensors to perform the backward pass during training. This video covers the fundamentals of Autograd, including the advantages of runtime computation tracking, the role Autograd plays in model training, how to determine when Autograd is and is not active, profiling with Autograd, and Autograd's high-level API.
Download the Autograd notebook here: pytorch-tutori...
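For reference, a minimal sketch of the ideas listed above (illustrative values and shapes, not the notebook's exact code):

    import math
    import torch

    # Leaf tensors created with requires_grad=True are tracked by autograd.
    x = torch.linspace(0., 2. * math.pi, steps=25, requires_grad=True)
    y = torch.sin(x)      # the sin op is recorded in the computation graph
    out = y.sum()         # reduce to a scalar so backward() needs no arguments

    out.backward()        # the backward pass populates x.grad
    print(x.grad)         # d(out)/dx = cos(x), computed automatically

    # Tracking can be switched off, e.g. for inference or weight updates:
    with torch.no_grad():
        z = torch.sin(x)  # not tracked; z.requires_grad is False

    # Autograd also ships a profiler for timing the operations it records:
    with torch.autograd.profiler.profile() as prof:
        torch.sin(x).sum().backward()
    print(prof.key_averages().table(sort_by="self_cpu_time_total"))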

Comments: 18
@Sami_Wilf, 3 years ago
This is one of the most eloquent and succinct videos on training in pytorch I've seen.
@kuskofboern, 5 days ago
Great tutorial! Just a small correction regarding the explanation of minimizing a function in the notebook. It is stated in the "## What Do We Need Autograd For?" section that "We want to *minimize* the loss, which means making its first derivative with respect to the input equal to 0: $\frac{\partial L}{\partial x} = 0$.", but this is only a necessary condition for a critical point, not a sufficient criterion for a minimum. For example, the function could have a saddle point or a maximum instead. While the explanation holds for strictly convex and differentiable functions, in general, a minimum is defined by the additional condition that the second derivative (or Hessian) is positive definite. Just thought it’d be helpful to clarify. Keep up the great work!
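To make the point above concrete, autograd itself can check the second-order condition; a small illustrative sketch (f(x) = x³ is chosen arbitrarily as a function with a critical point that is not a minimum):

    import torch

    x = torch.tensor(0.0, requires_grad=True)
    f = x ** 3

    # The first derivative 3x^2 vanishes at x = 0, so x = 0 is a critical point...
    (df,) = torch.autograd.grad(f, x, create_graph=True)
    print(df)   # tensor(0.)

    # ...but the second derivative 6x is not positive there, so the
    # first-order condition alone does not certify a minimum.
    (d2f,) = torch.autograd.grad(df, x)
    print(d2f)  # tensor(0.)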
@HrishikeshMuruk, 3 years ago
Nice set of videos. I just wish the audio volume was higher.
@altayyk, 9 months ago
Thank you so much for your clear explanations! Makes the API much easier to understand when the mathematical intuition is already there!
@FavourAkpasi, 10 months ago
Very well explained series of videos, thank you. I do, however, feel bad for your Enter key.
@wayneqwele8847, 2 years ago
Awesome, that was very succinct and clear.
@cagdastopcu659, 2 years ago
This is the official doc video, but with such low-quality audio?
@shivampadmani_iisc, 5 months ago
Yes, because it's open source.
@5104James, 2 years ago
11:33 Did you mean: the partial derivative of loss with respect to INPUTS (not learning weights)?
@ayushbachan6113, 2 years ago
yes
@gebbione, a year ago
After running zero_grad, the grads are not zeroed; instead, .grad is set to None. I.e., if I just print .grad[0] I get a NoneType object error. Maybe change the last cell to:

    if model.layer2.weight.grad is not None:
        print(model.layer2.weight.grad[0][0:10])

    for i in range(0, 5):
        prediction = model(some_input)
        loss = (ideal_output - prediction).pow(2).sum()
        loss.backward()

    print(model.layer2.weight.grad[0][0:10])

    optimizer.zero_grad()
    print(model.layer2.weight.grad)

    if model.layer2.weight.grad is not None:
        print(model.layer2.weight.grad[0][0:10])
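The behavior described above matches the set_to_none flag on optimizer.zero_grad(), whose default flipped to True in newer PyTorch releases. A minimal, self-contained sketch of the two behaviors (the small Linear model and SGD optimizer here are stand-ins, not the notebook's model):

    import torch

    model = torch.nn.Linear(4, 2)                       # stand-in for the notebook's model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    loss = model(torch.randn(1, 4)).pow(2).sum()
    loss.backward()
    print(model.weight.grad[0])                          # a populated gradient row

    # Default in recent releases: set_to_none=True, so .grad is released, not zeroed.
    optimizer.zero_grad()
    print(model.weight.grad)                             # None -- indexing it would raise

    # Opting out keeps the .grad tensors and overwrites them with zeros instead.
    loss = model(torch.randn(1, 4)).pow(2).sum()
    loss.backward()
    optimizer.zero_grad(set_to_none=False)
    print(model.weight.grad[0])                          # a row of zeros, safe to index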
@atari1040, 2 years ago
Oooooohhhh! So THAT'S how it works... cool! Thx!
@mominabbas125, 3 years ago
Explained very well! (Y)
@myelinsheathxd, 2 years ago
thx!
@mahdiamrollahi8456, 2 years ago
Can you fix the audio?
@PurtiRS, a year ago
Good video. Go softer on the keyboard, please.
@adnanhashem98, 5 months ago
If you still think you don't understand Autograd, this video (kzbin.info/www/bejne/npvRh3-cq82BZrMsi=Y78dMifRIL_hJwyZ) walks through examples of calculating simple grads by hand and verifying them with PyTorch. I understood more from it than from any other video.
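In that spirit, one quick sanity check of a hand-computed derivative against autograd might look like this (an arbitrary example, not taken from the linked video): for y = x² + 3x, the hand-derived dy/dx = 2x + 3 equals 7 at x = 2.

    import torch

    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 2 + 3 * x

    y.backward()
    print(x.grad)  # tensor(7.) -- matches the hand-computed 2*x + 3 at x = 2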
@udaynj, a year ago
Video is blurry as hell