if anyone is wondering why the mixed derivatives are the same: it's Schwarz's theorem.
@balajilakshminarayan15933 жыл бұрын
Grant
@davuluri4395Ай бұрын
that's 3blue1brown, wrong channel
@jyotsanabenpanchal727112 күн бұрын
Bruh, the voice is Grant's! @@davuluri4395
@shcraft.36837 жыл бұрын
You have the same voice as the guy on 3blue1brown
@pressgreen6 жыл бұрын
@David Beyer Was looking for this comment. thanks lol
@SonLe-mk4sq4 жыл бұрын
I noticed that too... then I checked who he was.
@Luis-kd2te4 жыл бұрын
No wonder this made sense to me XD He is a blessing
@samiloom8565 Жыл бұрын
He is the same guy
@danwigodsky26124 ай бұрын
Grant Sanderson worked for Khan Academy.
@therealbean43724 жыл бұрын
Hey Grant, love your videos from Khan and 3Blue1Brown!
@haiarpyzargarian67144 жыл бұрын
Thank you so much, here I get much more information in one day than in university in a month)))
@debralegorreta13754 жыл бұрын
What does the Hessian matrix represent geometrically? In particular, what does the determinant of the Hessian matrix measure?
@lernenlernenlernen47074 жыл бұрын
That's really a good question! Sadly I can't answer it now, but I'll use it as an inspiration to look into it, when I have the time. I think the best strategy to approach this problem is to calculate the determinant for some 2 or 3 dimensional functions and then play around with different values for x,y and z.
@chiemxerxobi2 ай бұрын
pretty sure the determinant of any matrix represents a signed volume (area, in 2D) scaling factor. So the determinant of the Hessian matrix might have something to do with how the rate of the rate of change of that particular function scales. I might just be spitting some hot takes tho lol
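As a concrete follow-up to the determinant question above, here is a minimal numeric sketch (my own example, not from the video). For a two-variable function, det(H) = fxx·fyy − fxy², which is exactly the discriminant used in the second-partial-derivative test: a negative determinant signals a saddle, a positive one a genuine max or min.

```python
def hessian_2d(f, x, y, h=1e-4):
    """Approximate the 2x2 Hessian of f at (x, y) with central differences."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / (h * h)
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / (h * h)
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h * h)
    return [[fxx, fxy], [fxy, fyy]]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

f = lambda x, y: x**2 + 3 * x * y + y**2     # exact Hessian: [[2, 3], [3, 2]]
H = hessian_2d(f, 1.0, 2.0)
print(det2(H))   # ~ 2*2 - 3*3 = -5: negative, so saddle-like curvature
```

Geometrically, det(H) is the product of the Hessian's eigenvalues, i.e. the product of the principal curvatures of the quadratic approximation, which is why its sign carries the saddle-vs-extremum information.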
@firstkaransingh Жыл бұрын
Awesome guy.... Mr Sanderson.
@AJ-et3vf2 жыл бұрын
Awesome video! Thank you! And wow! It's 3Blue1Brown's voice doing this video!
@MohamedJama-zt7tk Жыл бұрын
I love u Khan!! u save me today
@AJSLego6 жыл бұрын
3 blue one brown?
@AbhishekSachans6 жыл бұрын
Yes!! It's Grant only..
@BayesianBrain6 жыл бұрын
Can someone explain why the ideal learning rate for 2 or more dimensions in the gradient descent algorithm is the inverse of the Hessian (matrix of second partial derivatives)?
@proximalfuturism5 жыл бұрын
This guy explains it well: medium.com/@ranjeettate/learning-rate-in-gradient-descent-and-second-derivatives-632137dad3b5 . Intuitively, the first derivative only gives us the change in loss w.r.t. x as a straight-line approximation; the second derivatives give us information about the curvature of the loss function, which tells us how big a step we can safely take.
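A toy illustration of the point above (my own example, not from the thread): on the quadratic bowl f(x, y) = 2x² + 0.5y², the Hessian is diag(4, 1). A single scalar learning rate has to compromise between the two curvatures, while a Newton step, which multiplies the gradient by the inverse Hessian, lands exactly on the minimum of a quadratic in one step.

```python
grad = lambda x, y: (4 * x, 1 * y)   # gradient of f(x, y) = 2x^2 + 0.5y^2

# Plain gradient descent: one scalar rate for both directions.
x, y = 1.0, 1.0
for _ in range(10):
    gx, gy = grad(x, y)
    x, y = x - 0.1 * gx, y - 0.1 * gy

# Newton step: the "learning rate" is the inverse Hessian diag(1/4, 1/1).
nx, ny = 1.0, 1.0
gx, gy = grad(nx, ny)
nx, ny = nx - gx / 4, ny - gy / 1
print((x, y), (nx, ny))   # GD is still away from (0, 0); Newton is exactly there
```

This is why the "ideal" per-direction step size is tied to the inverse Hessian: each eigendirection wants a rate of 1 over its curvature.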
@user-mk3yl5fe4m3 жыл бұрын
So clear and helpful!
@siyuzhang78673 жыл бұрын
the moment I clicked on this link, oh this is the 3blue1brown guy!
@hussainbhavnagarwala2596 Жыл бұрын
What if the output of the function f was a vector with 3 components instead of a single expression? How would the Hessian change?
@catouncormery29958 жыл бұрын
you are perfect, thanks for your videos, and your funny mood :)
@ConradOPrice6 жыл бұрын
0:00 - 0:06 sorry what? Don't think I've ever been confused so quickly in a tutorial.
@im-alida7 жыл бұрын
awesome! :-) I have a question: what kind of tools are you using when you work? I really want to get that blackboard tool :-) thx in advance
@abdullahbinjahed69005 жыл бұрын
a math-animating Python library ... I guess
@phuocsangnguyen64112 жыл бұрын
thank you so much ! so helpful
@anamitrasingha63624 жыл бұрын
So the Hessian matrix is valid only for scalar-valued functions, right? If my intuition is correct, then for a vector-valued function with, say, 4 components, would there be 4 Hessian matrices?
@brandomiranda67036 жыл бұрын
When will the matrix not be symmetric across the diagonal?
@ethanbooth11744 жыл бұрын
when the function is not continuous
@joluju23753 жыл бұрын
@@ethanbooth1174 Sure of that? I mean, for the second mixed derivatives to be different, they have to exist.
@KaloqnBankov8 жыл бұрын
What if the function is a matrix itself? The Hessian matrix will be a matrix of matrices?
@SwaeTech7 жыл бұрын
...yep
@aldolunabueno26346 жыл бұрын
Tensor?
@valerianmp5 жыл бұрын
Albanovaphi7 just a block matrix
@MegaBdboy7 жыл бұрын
DUDE and why don't you tell me how to find extrema points with this !!!
@robertwilsoniii20486 жыл бұрын
Gamer Sparta You find the eigenvalues of this matrix at the critical points, after solving the equations that set the gradient to zero. Then find out if the matrix is positive definite, negative definite, indefinite, or only semi-definite (in which case the test is inconclusive).
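A small sketch of the recipe above (the functions and names here are my own, not from the video): at a point where the gradient is zero, the signs of the Hessian's eigenvalues classify the critical point. For a symmetric 2x2 matrix the eigenvalues have a closed form.

```python
import math

def eig_sym2(a, b, c):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    mean, radius = (a + c) / 2, math.hypot((a - c) / 2, b)
    return mean - radius, mean + radius

def classify(hessian):
    """Second-derivative test via the eigenvalue signs of a 2x2 Hessian."""
    (a, b), (_, c) = hessian
    lo, hi = eig_sym2(a, b, c)
    if lo > 0:
        return "local minimum"      # positive definite
    if hi < 0:
        return "local maximum"      # negative definite
    if lo < 0 < hi:
        return "saddle point"       # indefinite
    return "inconclusive"           # semi-definite: the test says nothing

# f(x, y) = x^2 - y^2 has gradient zero at the origin; H = [[2, 0], [0, -2]].
print(classify([[2, 0], [0, -2]]))   # saddle point
print(classify([[2, 1], [1, 2]]))    # local minimum (eigenvalues 1 and 3)
```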
@evenamathias63552 ай бұрын
What is the point of finding H, why do we use it? Is it some sort of solution or something, i dont reallly get it
@gogl0l3866 жыл бұрын
Is there a vector form of the multivariable Taylor series?
@robertwilsoniii20482 жыл бұрын
I'm pretty sure that's the Taylor expansion using the gradient for the first-order term and the Hessian for the second-order term.
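For reference, the vector form of the quadratic (second-order) Taylor approximation of a scalar-valued f around a point a, with the gradient supplying the linear term and the Hessian the quadratic term, is:

```latex
f(\mathbf{x}) \approx f(\mathbf{a})
  + \nabla f(\mathbf{a})^{\mathsf{T}}(\mathbf{x} - \mathbf{a})
  + \tfrac{1}{2}(\mathbf{x} - \mathbf{a})^{\mathsf{T}}\, H_f(\mathbf{a})\, (\mathbf{x} - \mathbf{a})
```

For a vector-valued f the linear term uses the Jacobian instead of the gradient, and the quadratic term needs one Hessian per output component (a third-order array).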
@나누나누-h6t8 жыл бұрын
how do i know which channel of khan academy is for this video?
@emilybarnard4048 жыл бұрын
You can find this video on the Khan Academy website by using the search bar at the top of the screen and typing in "hessian." Here is the link: www.khanacademy.org/math/multivariable-calculus/applications-of-multivariable-derivatives/quadratic-approximations/v/the-hessian-matrix
@rubyemes Жыл бұрын
Sal, if I have 1 equation and 6 independent variables, my first partial derivatives form a vector with 6 terms. If I follow, the Hessian will be a 6x6 matrix. Is that correct? Thanks!!! I contribute to you, as your program and platform make an amazing contribution!
@jyly261 Жыл бұрын
I know it's not 3blue1Brown answering but you're right.
@rubyemes Жыл бұрын
@@jyly261 thanks for confirming
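A quick shape check confirming the exchange above (my own sketch, with my own helper names): one scalar equation in n variables has an n-term gradient and an n-by-n Hessian, built here with central differences.

```python
def num_hessian(f, x, h=1e-4):
    """Numeric Hessian of f: R^n -> R at the point x (a list of floats)."""
    n = len(x)
    def shifted(i, j, si, sj):       # evaluate f with x[i], x[j] nudged by si*h, sj*h
        p = list(x)
        p[i] += si * h
        p[j] += sj * h
        return f(p)
    return [[(shifted(i, j, 1, 1) - shifted(i, j, 1, -1)
              - shifted(i, j, -1, 1) + shifted(i, j, -1, -1)) / (4 * h * h)
             for j in range(n)] for i in range(n)]

f = lambda p: sum(v * v for v in p)       # 1 equation, 6 independent variables
H = num_hessian(f, [1.0] * 6)
print(len(H), len(H[0]))                  # 6 6 -- a 6x6 matrix, as expected
```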
@mikhaelhalbar417 Жыл бұрын
Is this the 3blue1brown guy?
@wunanzeng70514 жыл бұрын
Do you guys know which lecture/series/playlist is this video from? Please let me know! Thanks!
@chaoticcubes49294 жыл бұрын
Wunan Zeng Khan Academy multivariable calculus playlist
@jermaineoneal1236 жыл бұрын
Thank you!
@ricosrealm Жыл бұрын
Is this 3blue1brown as the lecturer?
@moonisal6 жыл бұрын
Thank you
@subashsubashsubash6 жыл бұрын
Thank you sir, this video has given me a good idea
@David-xq3bg6 жыл бұрын
When are Fxy and Fyx not equal?
@matakos226 жыл бұрын
When F is not C2
@joluju23753 жыл бұрын
@@matakos22 Sorry, I don't know what C2 means ... do you have an example please?
@matakos223 жыл бұрын
@@joluju2375 Continuously differentiable functions are sometimes said to be of class C1. A function is of class C2 if the first and second derivatives of the function both exist and are continuous. More generally, a function is said to be of class Ck if the first k derivatives f′(x), f′′(x), ..., f^(k)(x) all exist and are continuous. If the derivatives f^(n) exist for all positive integers n, the function is smooth, or equivalently, of class C∞.
@joluju23753 жыл бұрын
@@matakos22 Thanks. So, for Fxy and Fyx not to be equal, they have to exist. Then, if F is not C2 and Fxy and Fyx exist, it means that Fxy or Fyx is not continuous. Right ?
@matakos223 жыл бұрын
@@joluju2375 Yes, or they could also be undefined
@shantanu_bhattacharya6 жыл бұрын
Good day, I was wondering whether you know any Python library that has implemented second-order gradient descent with the Hessian error matrix. If you can point me in the right direction, it would be very helpful. Thanks in advance, kind regards, Shantanu
@swapanjain8922 жыл бұрын
Jax
@mdehsanullahkhan14613 жыл бұрын
Thank you for the amazing explanation
@feynmath7 жыл бұрын
I think in place of the Hessian you have actually written the Hessian transpose! (Though when the mixed partials are equal, the Hessian is symmetric, so the two coincide.)
@michaelroditis19523 жыл бұрын
When can fxy != fyx?
@OverLordOfDa3rdWorld5 жыл бұрын
Wow, amazing. Thank you!
@engineeryadav73233 жыл бұрын
is he 3 blue 1 brown ???
@epicfudge98172 жыл бұрын
Is this the same guy as 3 blue 1 brown?
@abdulkadercerkezi14482 жыл бұрын
hey, you are 3blue1brown?
@zixiaoxu9776 жыл бұрын
interesting
@avibank7 жыл бұрын
Ah, so this is where the formula for the discriminant comes from. We can see that taking the determinant of the Hessian gives the formula for the discriminant. I know it works for R^2. Will verify for R^3 and R^n as an exercise. Thanks!
@oneiro14432 жыл бұрын
explain
@catsexe6932 Жыл бұрын
And the rest of the video?
@hamandresfr Жыл бұрын
The video has a Khan Academy fundraiser on it, but the voice is 3b1b's, heh? 🧐🧐
@hamzabelmengaa25046 жыл бұрын
What about functions which have three variables 😩
@BayesianBrain6 жыл бұрын
He explains it at 4:24
@hamzabelmengaa25046 жыл бұрын
@@BayesianBrain thank you
@anamitrasingha63624 жыл бұрын
@@BayesianBrain what about a vector valued function?
@montagne21988 жыл бұрын
Came for Neo and Morpheus, left disappointed.
@AdanLoeraRuizАй бұрын
I dont get it
@aaryagohil70006 жыл бұрын
fitz?
@heylol336 жыл бұрын
Genius.
@paulolaranjeira9361 Жыл бұрын
Aren't you the guy of 3 Blue 1 Brown?
@maybeinactive Жыл бұрын
Grant???
@twistedlot6 жыл бұрын
in honor of Otto Hesse
@klam774 жыл бұрын
youre related?
@itskelvinn7 жыл бұрын
If you differentiate x first, then y, shouldn't it be "dxdy"? Why do you keep putting it backwards?
@joshuat61247 жыл бұрын
No, the way he does it is notationally correct. Of course, you are free to write things the way you like, but he is following the convention.
@minalouisyassa7 жыл бұрын
Well you should think of it this way: d/dx (df/dy), so you take df/dy and then differentiate it with respect to x, so the video is correct. In other words we start with the partial derivative with respect to y and then differentiate it with respect to x.
@punaydang29485 жыл бұрын
Because we move right to left in Leibniz notation
@Eng_Hamza_kw2 жыл бұрын
👌👌👌👌
@samueljeromillson2 жыл бұрын
Dude I’m in 8th grade doing calc 1 and I already understand this.
@darkseeven5 жыл бұрын
go check 3b1b, sound just like you, he is a cool guy
@joel.ds.m5 жыл бұрын
It's the same guy 😂
@darkseeven5 жыл бұрын
Joel McAllister i know:))
@SubhenduMallick-lp1fo11 ай бұрын
Jay Shree Ram
@rishavjain50872 жыл бұрын
I thought i clicked 3b1b's video
@drewcarmichael17838 жыл бұрын
First
@shayakoo13 жыл бұрын
Aye 3B1B
@adamc54785 жыл бұрын
You don't explain the mixed derivative thing clearly, disliked.