The Hessian matrix | Multivariable calculus | Khan Academy

456,344 views

Khan Academy

Days ago

Comments: 104
@francescomura3228 · 2 years ago
If anyone is wondering why the mixed derivatives are the same: it's Schwarz's theorem.
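That symmetry is easy to sanity-check symbolically. A minimal sketch using sympy (the example function is my own, not from the video):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x) * sp.sin(y) + x**3 * y**2  # any C2 function will do

fxy = sp.diff(f, x, y)  # differentiate w.r.t. x first, then y
fyx = sp.diff(f, y, x)  # differentiate w.r.t. y first, then x

# Schwarz's theorem: for C2 functions the mixed partials coincide,
# which is why the Hessian is a symmetric matrix
assert sp.simplify(fxy - fyx) == 0
```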
@balajilakshminarayan1593 · 3 years ago
Grant
@davuluri4395 · a month ago
That's 3blue1brown, wrong channel
@jyotsanabenpanchal7271 · 12 days ago
@@davuluri4395 Bruh, the voice is Grant's!
@shcraft.3683 · 7 years ago
You have the same voice as the guy on 3blue1brown
@pressgreen · 6 years ago
@David Beyer Was looking for this comment. Thanks lol
@SonLe-mk4sq · 4 years ago
I noticed that too... then I checked who he was.
@Luis-kd2te · 4 years ago
No wonder this made sense to me XD He is a blessing
@samiloom8565 · a year ago
He is the same guy
@danwigodsky2612 · 4 months ago
Grant Sanderson worked for Khan Academy.
@therealbean4372 · 4 years ago
Hey Grant, love your videos from Khan and 3Blue1Brown!
@haiarpyzargarian6714 · 4 years ago
Thank you so much, here I get more information in one day than in a month at university)))
@debralegorreta1375 · 4 years ago
What does the Hessian matrix represent geometrically? In particular, what does the determinant of the Hessian matrix measure?
@lernenlernenlernen4707 · 4 years ago
That's really a good question! Sadly I can't answer it now, but I'll use it as inspiration to look into it when I have the time. I think the best strategy to approach this problem is to calculate the determinant for some 2- or 3-dimensional functions and then play around with different values for x, y and z.
@chiemxerxobi · 2 months ago
Pretty sure the determinant of anything kind of represents the dimension-specific area of said thing. So the determinant of the Hessian matrix might have something to do with the area of the rate of the rate of change of that particular function. I might just be spitting hot air tho lol
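For two variables there is a crisp partial answer to the question above: the determinant of the Hessian, fxx·fyy − fxy², is exactly the discriminant used in the second-derivative test, and its sign says whether the two principal curvatures agree. A small numeric sketch (the example function is assumed, not from the video):

```python
import numpy as np

# f(x, y) = x**2 + 3*x*y + y**2 has this constant Hessian everywhere
H = np.array([[2.0, 3.0],   # [f_xx, f_xy]
              [3.0, 2.0]])  # [f_yx, f_yy]

D = np.linalg.det(H)  # discriminant: f_xx*f_yy - f_xy**2 = 4 - 9 = -5
# D < 0 means the principal curvatures have opposite signs,
# so the critical point at the origin is a saddle
assert D < 0
```

Equivalently, det H is the product of the Hessian's eigenvalues, so a negative determinant forces one positive and one negative curvature direction.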
@firstkaransingh · a year ago
Awesome guy... Mr. Sanderson.
@AJ-et3vf · 2 years ago
Awesome video! Thank you! And wow, it's 3Blue1Brown's voice doing this video!
@MohamedJama-zt7tk · a year ago
I love you, Khan!! You saved me today
@AJSLego · 6 years ago
3 blue one brown?
@AbhishekSachans · 6 years ago
Yes!! It's Grant only..
@BayesianBrain · 6 years ago
Can someone explain why the ideal learning rate for 2 or more dimensions in the gradient descent algorithm is the inverse of the Hessian (matrix of second partial derivatives)?
@proximalfuturism · 5 years ago
This guy explains it well: medium.com/@ranjeettate/learning-rate-in-gradient-descent-and-second-derivatives-632137dad3b5 . Intuitively, the first derivative approximates the change in loss w.r.t. x as a straight line; second derivatives give us information about the curvature of the loss function.
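The idea behind that answer is Newton's method: multiplying the gradient by the inverse Hessian rescales each direction by its curvature, and on an exactly quadratic loss a single step lands on the minimum. A minimal sketch with an assumed toy quadratic:

```python
import numpy as np

def newton_step(grad, hess, x):
    # x_new = x - H(x)^{-1} grad(x); solve() avoids forming the inverse
    return x - np.linalg.solve(hess(x), grad(x))

# quadratic loss f(x) = 0.5 x^T A x - b^T x, whose minimum solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b  # gradient of f
hess = lambda x: A          # Hessian of f (constant for a quadratic)

x = newton_step(grad, hess, np.zeros(2))
assert np.allclose(A @ x, b)  # one step reaches the exact minimum
```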
@user-mk3yl5fe4m · 3 years ago
So clear and helpful!
@siyuzhang7867 · 3 years ago
The moment I clicked on this link: oh, this is the 3blue1brown guy!
@hussainbhavnagarwala2596 · a year ago
What if the output of the function f were a vector with 3 rows instead of a single expression? How would the Hessian change?
@catouncormery2995 · 8 years ago
You are perfect, thanks for your videos and your funny mood :)
@ConradOPrice · 6 years ago
0:00 - 0:06 Sorry, what? I don't think I've ever been confused so quickly in a tutorial.
@im-alida · 7 years ago
Awesome! :-) I have a question: what kind of tools are you using when you work? I really want to get that blackboard tool :-) Thanks in advance
@abdullahbinjahed6900 · 5 years ago
A math-animating Python library... I guess
@phuocsangnguyen6411 · 2 years ago
Thank you so much! So helpful
@anamitrasingha6362 · 4 years ago
So this Hessian matrix is valid only for scalar-valued functions, right? If my intuition is correct, then for a vector-valued function with, say, 4 components, would there be 4 Hessian matrices?
@brandomiranda6703 · 6 years ago
When will the matrix not be symmetric?
@ethanbooth1174 · 4 years ago
When the function is not continuous
@joluju2375 · 3 years ago
@@ethanbooth1174 Sure about that? I mean, for the second mixed derivatives to be different, they have to exist.
@KaloqnBankov · 8 years ago
What if the function is a matrix itself? Would the Hessian matrix be a matrix of matrices?
@SwaeTech · 7 years ago
...yep
@aldolunabueno2634 · 6 years ago
A tensor?
@valerianmp · 5 years ago
@Albanovaphi7 Just a block matrix
@MegaBdboy · 7 years ago
DUDE, why don't you tell me how to find extrema points with this!!!
@robertwilsoniii2048 · 6 years ago
@Gamer Sparta You find the eigenvalues of this matrix after solving for the points where the gradient is zero. Then find out if the matrix is positive definite, negative definite or indefinite.
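A sketch of that recipe with numpy, using an assumed example Hessian evaluated at some critical point:

```python
import numpy as np

H = np.array([[4.0, 2.0],     # Hessian evaluated where the gradient is zero
              [2.0, 12.0]])
eigs = np.linalg.eigvalsh(H)  # eigvalsh: eigenvalues of a symmetric matrix

if np.all(eigs > 0):
    verdict = "positive definite -> local minimum"
elif np.all(eigs < 0):
    verdict = "negative definite -> local maximum"
elif np.any(eigs > 0) and np.any(eigs < 0):
    verdict = "indefinite -> saddle point"
else:
    verdict = "semidefinite -> test inconclusive"
```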
@evenamathias6355 · 2 months ago
What is the point of finding H? Why do we use it? Is it some sort of solution or something? I don't really get it
@gogl0l386 · 6 years ago
Is there a vector form of the multivariable Taylor series?
@robertwilsoniii2048 · 2 years ago
I'm pretty sure that's the Taylor expansion using the Jacobian for the derivatives.
@나누나누-h6t · 8 years ago
How do I know which Khan Academy channel this video is from?
@emilybarnard404 · 8 years ago
You can find this video on the Khan Academy website by using the search bar at the top of the screen and typing in "hessian." Here is the link: www.khanacademy.org/math/multivariable-calculus/applications-of-multivariable-derivatives/quadratic-approximations/v/the-hessian-matrix
@rubyemes · a year ago
Sal, if I have 1 equation and 6 independent variables, my first partial derivatives form a vector with 6 terms. If I follow, the Hessian will be a 6x6 matrix. Is that correct? Thanks!!! I contribute to you, as your program and platform make an amazing contribution!
@jyly261 · a year ago
I know it's not 3blue1brown answering, but you're right.
@rubyemes · a year ago
@@jyly261 Thanks for confirming
@mikhaelhalbar417 · a year ago
Is this the 3blue1brown guy?
@wunanzeng7051 · 4 years ago
Do you guys know which lecture/series/playlist this video is from? Please let me know! Thanks!
@chaoticcubes4929 · 4 years ago
@Wunan Zeng Khan Academy's multivariable calculus playlist
@jermaineoneal123 · 6 years ago
Thank you!
@ricosrealm · a year ago
Is this 3blue1brown as the lecturer?
@moonisal · 6 years ago
Thank you
@subashsubashsubash · 6 years ago
Thank you sir, this video has given me a good idea
@David-xq3bg · 6 years ago
When are fxy and fyx not equal?
@matakos22 · 6 years ago
When f is not C2
@joluju2375 · 3 years ago
@@matakos22 Sorry, I don't know what C2 means... do you have an example please?
@matakos22 · 3 years ago
@@joluju2375 Continuously differentiable functions are sometimes said to be of class C1. A function is of class C2 if the first and second derivatives of the function both exist and are continuous. More generally, a function is said to be of class Ck if the first k derivatives f′(x), f′′(x), ..., f(k)(x) all exist and are continuous. If the derivatives f(n) exist for all positive integers n, the function is smooth or, equivalently, of class C∞.
@joluju2375 · 3 years ago
@@matakos22 Thanks. So, for fxy and fyx not to be equal, they have to exist. Then, if f is not C2 and fxy and fyx exist, it means that fxy or fyx is not continuous. Right?
@matakos22 · 3 years ago
@@joluju2375 Yes, or they could also be undefined
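The standard counterexample behind this thread is f(x, y) = xy(x² − y²)/(x² + y²) with f(0, 0) = 0: both mixed partials exist at the origin but are discontinuous there, and they disagree. A finite-difference sketch (step sizes are my own choices; the inner step must be much smaller than the outer one):

```python
def f(x, y):
    # the classic counterexample; smooth everywhere except at the origin
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y * (x**2 - y**2) / (x**2 + y**2)

h = 1e-6   # inner step, for the first derivatives
k = 1e-3   # outer step, for the second derivatives (k >> h)

def fx(x, y):  # df/dx by central difference
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def fy(x, y):  # df/dy by central difference
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

fxy0 = (fx(0.0, k) - fx(0.0, -k)) / (2 * k)  # d/dy of fx at the origin: -1
fyx0 = (fy(k, 0.0) - fy(-k, 0.0)) / (2 * k)  # d/dx of fy at the origin: +1

assert abs(fxy0 - (-1.0)) < 1e-2 and abs(fyx0 - 1.0) < 1e-2
```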
@shantanu_bhattacharya · 6 years ago
Good day, I was wondering whether you know of any Python library that has implemented second-order gradient descent with the Hessian of the error. If you can point me in the right direction, it would be very helpful. Thanks in advance. Kind regards, Shantanu
@swapanjain892 · 2 years ago
Jax
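Expanding on that answer: JAX's `jax.hessian` builds the full matrix by composing forward- and reverse-mode autodiff, and a Newton-style step is then one linear solve. A minimal sketch, assuming JAX is installed (the loss function is a made-up example):

```python
import jax
import jax.numpy as jnp

def loss(w):
    # toy scalar loss; its Hessian is [[2*w1, 2*w0], [2*w0, 6*w1]]
    return w[0]**2 * w[1] + w[1]**3

w = jnp.array([1.0, 2.0])
g = jax.grad(loss)(w)      # gradient at w, shape (2,)
H = jax.hessian(loss)(w)   # full 2x2 Hessian at w

# one second-order ("Newton") step: w - H^{-1} g
w_next = w - jnp.linalg.solve(H, g)
```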
@mdehsanullahkhan1461 · 3 years ago
Thank you for the amazing explanation
@feynmath · 7 years ago
I think in place of the Hessian you have actually shown the Hessian transpose!
@michaelroditis1952 · 3 years ago
When can fxy != fyx?
@OverLordOfDa3rdWorld · 5 years ago
Wow, amazing. Thank you!
@engineeryadav7323 · 3 years ago
Is he 3blue1brown???
@epicfudge9817 · 2 years ago
Is this the same guy as 3blue1brown?
@abdulkadercerkezi1448 · 2 years ago
Hey, are you 3blue1brown?
@zixiaoxu977 · 6 years ago
Interesting
@avibank · 7 years ago
Ah, so this is where the formula for the discriminant comes from. We can see that taking the determinant of the Hessian gives the formula for the discriminant. I know it works for R^2. Will verify for R^3 and R^n as an exercise. Thanks!
@oneiro1443 · 2 years ago
Explain
@catsexe6932 · a year ago
And the rest of the video?
@hamandresfr · a year ago
The video is in a fundraiser but the voice is 3b1b's, heh? 🧐🧐
@hamzabelmengaa2504 · 6 years ago
What about functions which have three variables 😩
@BayesianBrain · 6 years ago
He explains it at 4:24
@hamzabelmengaa2504 · 6 years ago
@@BayesianBrain Thank you
@anamitrasingha6362 · 4 years ago
@@BayesianBrain What about a vector-valued function?
@montagne2198 · 8 years ago
Came for Neo and Morpheus, left disappointed.
@AdanLoeraRuiz · a month ago
I don't get it
@aaryagohil7000 · 6 years ago
Fitz?
@heylol33 · 6 years ago
Genius.
@paulolaranjeira9361 · a year ago
Aren't you the guy from 3Blue1Brown?
@maybeinactive · a year ago
Grant???
@twistedlot · 6 years ago
In honor of Otto Hesse
@klam77 · 4 years ago
You're related?
@itskelvinn · 7 years ago
If you differentiate with respect to x first, then y, shouldn't it be "dxdy"? Why do you keep putting it backwards?
@joshuat6124 · 7 years ago
No, the way he does it is notationally correct. Of course, you are free to write things the way you like, but he is following the convention.
@minalouisyassa · 7 years ago
Well, you should think of it this way: d/dx (df/dy), so you take df/dy and then differentiate it with respect to x, so the video is correct. In other words, we start with the partial derivative with respect to y and then differentiate it with respect to x.
@punaydang2948 · 5 years ago
Because we move right to left in Leibniz notation
@Eng_Hamza_kw · 2 years ago
👌👌👌👌
@samueljeromillson · 2 years ago
Dude, I'm in 8th grade doing calc 1 and I already understand this.
@darkseeven · 5 years ago
Go check out 3b1b, he sounds just like you, he is a cool guy
@joel.ds.m · 5 years ago
It's the same guy 😂
@darkseeven · 5 years ago
@Joel McAllister I know :))
@SubhenduMallick-lp1fo · 11 months ago
Jay Shree Ram
@rishavjain5087 · 2 years ago
I thought I clicked on a 3b1b video
@drewcarmichael1783 · 8 years ago
First
@tsunningwah3471 · a month ago
dddd
@shayakoo1 · 3 years ago
Aye 3B1B
@tsunningwah3471 · a month ago
黑人😊
@adamc5478 · 5 years ago
You don't explain the mixed derivative thing clearly. Disliked.
@JensenPlaysMC · 4 years ago
Because there are other videos dedicated to this.