L6.2 Understanding Automatic Differentiation via Computation Graphs

8,576 views

Sebastian Raschka

3 years ago

As previously mentioned, PyTorch can compute gradients automatically for us. To do that, it tracks computations in a computation graph, and when it is time to compute the gradient, it moves backward along that graph. Computation graphs are also a helpful concept for learning how differentiation (computing partial derivatives and gradients) works, which is what we are doing in this video.
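For a concrete picture (a minimal sketch with made-up values and variable names, not code from the video), this is roughly how it looks with PyTorch's autograd: operations on tensors that have requires_grad=True add nodes to the graph during the forward pass, and calling .backward() walks the graph in reverse, applying the chain rule at each node:

import torch

# Leaf tensors with requires_grad=True are tracked in the computation graph
w = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)
x = torch.tensor(3.0)  # plain input; no gradient needed

# Forward pass: each operation adds a node to the graph
u = w * x        # u = 6.0
z = u + b        # z = 7.0
loss = z ** 2    # loss = 49.0

# Backward pass: walk the graph from loss back to the leaves,
# applying the chain rule at every node
loss.backward()

print(w.grad)    # d(loss)/dw = 2*z*x = 2*7*3 = 42
print(b.grad)    # d(loss)/db = 2*z*1 = 14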
Slides: sebastianraschka.com/pdf/lect...
-------
This video is part of my Introduction to Deep Learning course.
Next video: • L6.3 Automatic Differe...
The complete playlist: • Intro to Deep Learning...
A handy overview page with links to the materials: sebastianraschka.com/blog/202...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Comments: 12
@karimelkabbaj 1 year ago
Thank you very much for this simplified explanation. I'd been struggling to understand it until I found this masterpiece.
@SebastianRaschka 1 year ago
Nice, glad to hear that this was useful!!
@manuelkarner8746 2 years ago
Thank you, finally I understand this perfectly (and can now repeat it for myself). When explaining backpropagation, my professors always said "then this is just the chain rule" and skipped any explanation of how to calculate (complicated) toy examples. I knew the chain rule, but in the backprop context it was just too confusing. Anyway, I have a question: at 12:23 you said that canceling the delta terms is technically not allowed. Could you elaborate on the math behind why, or point me to some resource explaining this? Intuitively I always thought canceling deltas is strange/informal, but I never found out how this delta notation fits into "normal" math notation. :)
@SebastianRaschka 2 years ago
Nice, I am really glad to hear that! And yes, I totally agree. When I first learned it, it was also very confusing because the prof tried to brush it aside ("it's just calculus and the chain rule"), just like you described!
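(Not from the video, but for reference: the "canceling" in question is the Leibniz-notation chain rule,

\frac{\partial L}{\partial w} = \frac{\partial L}{\partial a} \cdot \frac{\partial a}{\partial w},

which looks as if the \partial a terms cancel like fractions. The partials are not independent quantities that can be divided out, though; the identity holds because of the chain rule itself, so the apparent cancellation is best read as a mnemonic rather than a formal algebraic step.)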
@nak6608 9 months ago
Love your textbooks and your videos. Thank you!
@mahmoodmohajer1677 4 months ago
Thanks for putting up this video.
@user-kw4kp7eq9m 10 months ago
Thank you very much!
@Gzzzzzz111 8 months ago
YOU ARE GOATED!
@736939 2 years ago
17:27 In the formula on the top-left (as I understood it) there is no sum but rather stacking (or concatenating), so why should we add the results from the different paths during the backward chain computation? Does it always work like this, i.e., just take the sum in the chain whenever there is a concatenation?
@SebastianRaschka 2 years ago
Sorry if this was misleading. In the upper left corner, this was more like a function notation to highlight the function arguments. Like if you have a function L that computes x^2 + y^2, then it's basically like writing L(x, y) = x^2 + y^2. There is no concatenation. With the square brackets I meant to show that sigma_3 also takes function arguments; I just used square brackets (instead of round brackets) so it is easier to read, but now I can see how this can be confusing.
@736939 2 years ago
@@SebastianRaschka Thank you very much.
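For the multiple-paths question above, here is a minimal PyTorch sketch (illustrative values, not from the video) of why contributions from different paths are summed: when a variable feeds into several downstream operations, the multivariable chain rule says the total gradient is the sum of the per-path contributions, and autograd accumulates them accordingly.

import torch

x = torch.tensor(3.0, requires_grad=True)

# x is used along two paths that later merge into the loss
a = x * 2        # path 1: da/dx = 2
b = x ** 2       # path 2: db/dx = 2*x = 6
loss = a + b     # d(loss)/da = 1, d(loss)/db = 1

loss.backward()

# Multivariable chain rule: d(loss)/dx = 1*2 + 1*6 = 8
print(x.grad)    # tensor(8.)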
@Epistemophilos 1 year ago
At 11:27 it gets confusing because you switch the terms around. Otherwise, very nice video.