You have clearly brought out a geometrical interpretation of the directional derivative. This has not been explained as clearly anywhere else in the videos or books. Well done. This is deep knowledge.
@seanki98 8 years ago
oh my god this is the 3blue1brown guy doing a Khan Academy video!!! life is complete :D
@BitcoinIsGoingToZero 6 years ago
I originally went to 3b1b for multivariable videos, but he didn't have any. Then I came here, and was amped to hear his voice.
@xoppa09 7 years ago
At the end of the video I think it makes sense to normalize the directional derivative, since the length of the vector is immaterial: what matters is the direction of the derivative. Here is another way to see this. Suppose we rotate the axes x and y so that one of them points in the direction of the vector v (let's say we rotate the positive x axis so it coincides with v). It is true by algebra that ∂f/∂v = (∂f/∂v) / 1. Now if we make a 'nudge' of a unit step in the v direction (which is now the x direction), we get a change of ∂f/∂v in the z direction (along the tangent line). This is because (∂f/∂v) / 1 * 1 = ∂f/∂v, which was our goal all along. It is perhaps easier to see this if we draw a two-dimensional graph, v on the horizontal axis and f on the vertical axis. If we draw a tangent line to the curve at the point (a,b), we can draw a tiny right triangle with sides ∂f and ∂v. By similar triangles we can then draw a larger triangle with sides ∂f/∂v and 1. So it's better to only consider unit vectors v; we can denote that v is a unit vector by putting a little hat above it, as in *v̂*.
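A small numerical sketch of the point this comment makes. The function f(x, y) = x²y, the point a = (1, 2), and the direction v = (3, 4) are hypothetical choices, not taken from the video; the idea is only that once v is normalized, the finite-difference quotient no longer depends on the length of the vector you started with.

```python
import math

def f(x, y):
    # Hypothetical test function, f(x, y) = x^2 * y
    return x**2 * y

def directional_derivative(a, v, h=1e-6, normalize=True):
    """Finite-difference approximation of the directional derivative of f at a.

    If normalize is True, v is first rescaled to unit length, so the result
    measures the change in f per *unit* step in the direction of v.
    """
    ax, ay = a
    vx, vy = v
    if normalize:
        n = math.hypot(vx, vy)
        vx, vy = vx / n, vy / n
    return (f(ax + h * vx, ay + h * vy) - f(ax, ay)) / h

a = (1.0, 2.0)
d1 = directional_derivative(a, (3.0, 4.0))  # direction (3, 4)
d2 = directional_derivative(a, (6.0, 8.0))  # same direction, twice the length

# Both agree: grad f = (2xy, x^2) = (4, 1) at a, and the unit direction is
# (0.6, 0.8), so the directional derivative is 4*0.6 + 1*0.8 = 3.2
```

With `normalize=False`, `d2` would come out twice as large as `d1`, which is exactly the ambiguity the normalization removes.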
@LaureanoLuna 5 years ago
As it is a DIRECTIONAL derivative, it should only depend on the direction, not on the magnitude, hence the vector should always be a unit vector.
@MachineMind517 4 years ago
Great explanation. When you already understand this, gradient descent algorithm becomes easier.
@curiousman1812 4 years ago
Yes, and when you grasp the gradient concept, the conjugate gradient method becomes a piece of cake :)
@MachineMind517 4 years ago
@@curiousman1812 Definitely..
@bakhtawarnizamani6618 4 years ago
Exactly, that's what I'm here for
@sayitsilence9135 7 years ago
what does the mean in the directional derivative definition? Isn't the specific vector enough?
@nissinsobaw3074 2 years ago
I smiled when I heard the voice and instantly knew I was going to understand the subject
@HiuYatLam-mv5wv 9 months ago
me too
@2000mroliver 5 years ago
so the directional derivative is basically a combination of partial derivatives?
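That reading can be checked numerically: for a differentiable function, the directional derivative along v is the weighted sum of the partial derivatives, i.e. grad f · v. The function f(x, y) = 3x + xy and the numbers below are a hypothetical example, not from the video.

```python
def f(x, y):
    # Hypothetical test function, f(x, y) = 3x + xy
    return 3*x + x*y

h = 1e-6
a = (2.0, 5.0)
v = (1.0, 2.0)

# Partial derivatives by finite differences
df_dx = (f(a[0] + h, a[1]) - f(*a)) / h   # analytically 3 + y = 8
df_dy = (f(a[0], a[1] + h) - f(*a)) / h   # analytically x = 2

# "Combination of partials": weight each partial by the matching component of v
weighted_sum = df_dx * v[0] + df_dy * v[1]               # 8*1 + 2*2 = 12

# Directional derivative straight from the limit definition
direct = (f(a[0] + h*v[0], a[1] + h*v[1]) - f(*a)) / h

# weighted_sum and direct agree (up to floating-point error)
```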
@maaan8494 6 years ago
This guy is really good tbh
@xoppa09 7 years ago
i love how you make the ideas of calculus transparent and approachable.
@liabraga4641 7 years ago
Great explanation as usual, thanks!!
@LaureanoLuna 6 years ago
A reminder of how vector addition works could be useful to explain why adding hv to a moves the input a small amount along the vector v.
@scholar-mj3om a year ago
Marvellous💯
@danielgv1526 7 years ago
The type of functions described in this video have their domain living in R^n and their codomain living in R. If we think about it, it's like going back to one-variable calculus, because we focus on just the line a + hv that lives in R^n. That is just one direction, like the x axis, and that's the way it should be seen. Also, if we choose v such that its norm is 1, things simplify a lot: the denominator |hv| that you mention at the end turns into |h| (that is most authors' justification for using unit vectors). The final form, lim as h→0 of (f(a+hv) - f(a))/h, looks just like a derivative in one dimension!
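The simplification this comment describes can be written out in one line. Assuming ‖v‖ = 1, the denominator collapses, and for h > 0 the absolute value can be dropped as well:

```latex
\[
  \|h\mathbf{v}\| \;=\; |h|\,\|\mathbf{v}\| \;=\; |h|\cdot 1 \;=\; |h|,
\]
so the quotient in the definition becomes
\[
  \frac{f(\mathbf{a} + h\mathbf{v}) - f(\mathbf{a})}{\|h\mathbf{v}\|}
  \;=\;
  \frac{f(\mathbf{a} + h\mathbf{v}) - f(\mathbf{a})}{|h|}
  \;\xrightarrow[\;h \to 0^{+}\;]{}\;
  \nabla_{\mathbf{v}} f(\mathbf{a}),
\]
```

which has exactly the shape of the one-variable difference quotient (f(a + h) - f(a))/h.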
@desitrump 8 months ago
Thank you so much, love you ❤
@johnhippisley9106 3 years ago
Ahah! This is why we require the magnitude of the vector with which we are taking the directional derivative to be 1! We are merely stepping in the *direction* of that vector; just like in single-variable calculus, we are trying to calculate the rate of change per *unit* increase.
@Postermaestro 7 years ago
so nice
@danieljaszczyszczykoeczews2616 4 years ago
thanks for the video
@flower-nw4df 4 years ago
Yeah, it's very nice, I like it 💕
@priyankkharat7407 5 years ago
I got the intuition behind this formal definition, but I have this doubt: will the directional derivative of f along vecA output a vector (each element being the partial change in f with respect to that dimension, like the gradient), or the sum of all the elements in that resulting vector (which, thinking intuitively, would be the dot product of vecA and the gradient, as shown in the previous video)?
@jonathanlemaitre6676 4 years ago
Hello, I'm a student. Thank you a lot, it helps me a lot. For the last formula, could the denominator be h * (norm of the vector v)?
@josephwheelerton 8 years ago
So I tried writing my own intuitive definition of the directional derivative. I came across a problem, though. I wrote something like, "the change in F (the output) resulting from a change in the direction of V is equal to the sum of the changes in F resulting from each of the components of V." There is something wrong there. Because derivatives as I understand them are infinitesimally small quantities in a *ratio*... so to say that the derivative is "the resulting change" doesn't quite do justice to the ratio part. Maybe it should be something like "the ratio between the change in the output and the change in the input." Can someone help with this? (Also, if the derivative is a ratio, then why does scaling V by 2, which then scales the resulting change in F by 2, give a different ratio? As I understand it, the scalar 2 should 'cancel out' in the ratio.)
@LaureanoLuna 6 years ago
The derivative always involves the ratio of the change in the output to the change in the input: it is the limit of that ratio as the latter tends to 0. The directional derivative is that limit computed when the input changes along the direction of the vector in question. Finally, of course, multiplying the numerator by 2 will double the ratio.
@andrei-un3yr 4 years ago
There's something that's not really clicking for me: if h approaches 0, isn't h * v the same as 2 * h * v? I mean, doesn't something infinitely small multiplied by any number give something that is also infinitely small?
@johnmccrae52 4 years ago
Yes, but you divide the whole thing by h, scaling it back up.
@edmel144 4 years ago
What I think John means is: when you actually compute the limit to get a specific derivative, the 'h' in the denominator and numerator has to factor out, or the terms where it doesn't factor out go to zero; otherwise the limit would go to infinity. Because h factors out of the numerator, multiplying v by 2 multiplies the numerator, and hence the resulting fraction, i.e. the derivative, by two, while the denominator stays the same.
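A tiny numerical check of what this thread is saying, using the hypothetical function f(x, y) = xy at a = (3, 2) with v = (1, 1): the h in the numerator cancels the h in the denominator, but the factor of 2 that comes from using 2v does not.

```python
def f(x, y):
    # Hypothetical test function, f(x, y) = x * y
    return x * y

def D(a, v, h=1e-7):
    # Difference quotient from the limit definition (h is small but fixed)
    return (f(a[0] + h*v[0], a[1] + h*v[1]) - f(*a)) / h

a = (3.0, 2.0)
d_v  = D(a, (1.0, 1.0))  # grad f = (y, x) = (2, 3), so (2,3).(1,1) = 5
d_2v = D(a, (2.0, 2.0))  # doubles to 10, even though h -> 0 in both cases
```

Shrinking h further changes neither result noticeably; only rescaling v does, which is why fixing v to unit length (or dividing by ‖v‖) is needed to make the answer depend on direction alone.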
@Raikaska 7 years ago
Shouldn't we, in df/d(v1,v2) evaluated at (a,b), divide by ||(a,b) + h(v1,v2) - (a,b)||? Thanks
@xoppa09 7 years ago
I don't think so, because the numerator of that limit, lim {h→0} [f( *a* + h *v* ) - f( *a* )]/h, is not a vector but a number. But as he mentioned at the end of the video, when we find the derivative in the direction of *v*, we could get different answers depending on whether we are looking at *v* or at *2v*, which points in the same direction. To solve this problem of getting different answers for the directional derivative, we can normalize the expression by dividing by the length of *v*: lim {h→0} [f( *a* + h *v* ) - f( *a* )]/(h | *v* |). Normalizing makes sense because if you rotate the axes so that one of the x or y axes points in the direction of *v*, we want the 'nudge' of the independent variable to be a unit step, since by algebra ∂f/∂v = (∂f/∂v) / (∂v/∂v) = (∂f/∂v) / 1. Then moving one unit step in the *v* direction along the tangent raises us by a height of ∂f/∂v. So it's better to only consider unit vectors *v*; we can denote that *v* is a unit vector by putting a little hat above it, as in v̂.
@whatislove4587 4 years ago
Why Delta f = Delta f times v?
@rickmook9422 6 years ago
Okay, so I understand this now, but I have no idea what the big picture is or what I can accomplish with it...
@xXxIMMORTALxXx 4 years ago
The directional derivative gives a foundation for the gradient (the vector that points in the direction of steepest ascent). The gradient is used a lot in optimization, for example in machine learning.
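A minimal sketch of the connection this comment draws: since the gradient points toward steepest ascent, repeatedly stepping against it minimizes f. The quadratic f(x, y) = (x - 1)² + (y + 2)², the starting point, and the step size are hypothetical choices for illustration.

```python
def grad_f(x, y):
    # Gradient of the hypothetical objective f(x, y) = (x - 1)^2 + (y + 2)^2
    return (2*(x - 1), 2*(y + 2))

x, y = 5.0, 5.0   # arbitrary starting point
lr = 0.1          # step size ("learning rate")

for _ in range(200):
    gx, gy = grad_f(x, y)
    # Step in the direction of steepest *descent*, i.e. minus the gradient
    x, y = x - lr * gx, y - lr * gy

# (x, y) converges to the minimizer (1, -2)
```

Each update shrinks the distance to the minimizer by a constant factor here (the iteration is x ← 0.8x + 0.2 in each coordinate), which is why a couple hundred steps are more than enough for this toy problem.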
@startuplyf1503 8 years ago
There are just not enough examples. Please do more examples in the future; no matter how well you think you can explain it, Sal succeeded in teaching me with tonnes of examples. My university lecturers do the same thing and I just can't understand it. Thanks
@yadpreetcheema4806 6 years ago
patrickjmt does lots of examples; you should check him out if you haven't seen his videos before.
@Sneaky1ne 5 years ago
You've probably already graduated, but for future readers: I actually make my own examples and try plugging numbers into the definition, and I find that it helps me a lot.
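In the spirit of this suggestion, here is one hypothetical worked example: plug shrinking values of h into the limit definition with f(x, y) = x² + y, the point a = (1, 0), and the unit direction v = (0.6, 0.8), and watch the difference quotient settle down.

```python
def f(x, y):
    # Hypothetical test function, f(x, y) = x^2 + y
    return x**2 + y

a, v = (1.0, 0.0), (0.6, 0.8)   # point and unit direction (|v| = 1)

for h in (0.1, 0.01, 0.001, 0.0001):
    # Difference quotient from the definition of the directional derivative
    q = (f(a[0] + h*v[0], a[1] + h*v[1]) - f(*a)) / h
    print(h, q)

# The quotients approach grad f . v = (2, 1) . (0.6, 0.8) = 2.0
```

Working the algebra by hand for this f gives q = 2 + 0.36h exactly, so you can see the limit emerge both numerically and symbolically.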
@DiegoMathemagician 5 years ago
Why isn't the denominator ||v||*h? It makes sense because that is the actual change in the input space.
@DiegoMathemagician 5 years ago
@poewhite But do you know the reason for defining v as a unit vector?
@jonathanlemaitre6676 4 years ago
I have the same question, did you find an answer to it ?