THIS MAKES SO MUCH SENSE!! Thank you so much for explaining this more clearly in a few minutes than my textbook could do in a few hours!
@Borey5678 years ago
I think this small video is worth a few 2-hour lectures at a university.
@dragosmanailoiu95445 years ago
lmao true
@123456789arty4 years ago
I just watched a 1-hour lecture about the Cramer-Rao Lower Bound and you are totally right :P this was waaay more informative.
@jaymei25325 years ago
Like everyone else said, very well explained. I feel way less jittery about this whole entire concept. Thank you in 2019!
@oscarlu99194 years ago
This explanation is excellent. It is crystal clear in explaining why there is an inverse relationship between the variance and the second derivative, why it is the second derivative, and why it is negative! Bravo, Prof. Ben!
@yukew41064 years ago
Hi Mr. Lambert, I just want to take a moment to thank you for taking the time to make these videos on KZbin. They are very easy to understand and by watching your videos I have been able to understand my statistical theory and bayesian statistics courses more as an undergrad. Thanks a lot and I wish you all the best!
@satltabur45973 years ago
In 7m and 59s you explained it better and more clearly than many 2h university lectures combined.
@LongyZ1310 years ago
Really appreciate videos like this where the aim is to provide an intuitive explanation of the concepts as opposed to going into detail on the maths behind them. Thanks.
@HappehLlama10 years ago
This was a fantastic intuitive explanation - thank you!
@accountname10472 years ago
This was my intuition when studying ML estimators in statistics, but I never got a straight answer about it from my teachers. Happy to see others think of it through a geometric lens! Great video
@andrewedson70104 years ago
Studying for actuarial exams, and the material just throws Fisher Information at you with no context. This will help me understand exactly what we are expected to do in the calculations. Thank you
@Manny123-y3j3 years ago
Damn. You explained this so well. I never have any idea what my professor is talking about, but videos like this help SO MUCH. Thank you!
@ishaansingh1789 A year ago
Beautifully explained, my friend. Intuition is almost always as important as the actual proof itself.
@tomthefall2 years ago
This is the best video I've seen on this topic, very well done.
@irocmath97275 years ago
Wow! This clarifies a good week or two from last year's lectures. I wish I had seen these videos when I was taking the course last year.
@johannaw2031 A year ago
This video makes one thing very clear to me: I find it strange how hard it apparently is for professors to provide some clear intuition. Why must it be so hard to be pedagogical when you really know something, which I expect a professor does? This is a working day of headache over horrible handouts made understandable in 5 minutes.
@mehinabbasova50139 months ago
This makes so much more sense now, thank you!
@Byc8454 years ago
The point of view via curvature is soooo great!
@cecicheng57919 years ago
Wow, I finally get the idea of the relationship between the covariance matrix and the Hessian.
@wahabfiles62604 years ago
So in other words, the covariance matrix is the Hessian of the maximum likelihood?
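Close: it is the inverse of the negative Hessian of the log-likelihood, evaluated at the maximum, that estimates the covariance of the MLE. A minimal sketch, assuming a hypothetical Bernoulli model (my own example, not from the video):

```python
import numpy as np

# Hypothetical Bernoulli example: the MLE of the success probability p
# is the sample mean.
rng = np.random.default_rng(0)
n = 10_000
x = rng.binomial(1, 0.3, size=n)
k = x.sum()
p_hat = k / n

# Second derivative of the log-likelihood l(p) = k*log(p) + (n-k)*log(1-p);
# this is the (1x1) Hessian of the LOG-likelihood, not of the likelihood.
def d2_loglik(p):
    return -k / p**2 - (n - k) / (1 - p)**2

# Inverting the NEGATIVE Hessian at the MLE gives the variance estimate...
var_from_hessian = -1.0 / d2_loglik(p_hat)

# ...which matches the familiar formula Var(p_hat) = p(1-p)/n.
var_exact = p_hat * (1 - p_hat) / n
print(var_from_hessian, var_exact)
```

In the multi-parameter case the same recipe applies with a genuine Hessian matrix and a matrix inverse.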
@MrYahya01013 years ago
You said we add the negative sign because the second derivative is negative after a certain value, and the negative sign is added to correct for that negative. What about when the second derivative is positive? Doesn't the negative sign make the second derivative negative then? Of what use will that be?
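One way to see it: at the maximum itself the log-likelihood is locally concave, so its second derivative there cannot be positive, and the minus sign turns it into a non-negative information. A small numerical check on a made-up Bernoulli example:

```python
# Made-up Bernoulli example: 37 successes in 100 trials.
k, n = 37, 100
p_hat = k / n  # MLE

def d2_loglik(p):
    # Second derivative of l(p) = k*log(p) + (n-k)*log(1-p)
    return -k / p**2 - (n - k) / (1 - p)**2

# At the MLE the log-likelihood is at a peak, so the curvature there is
# negative; negating it makes the information positive.
curvature = d2_loglik(p_hat)
information = -curvature
print(curvature, information)
```

Away from the maximum the second derivative can indeed change sign for some models, but the information used for the variance is the one at the peak.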
@SAGEmania-q8s5 months ago
Thank you so much. This explains so much.
@jubachoomba3 years ago
Those tangents illustrate the convexity... Jensen!
@michaelmalone76144 years ago
Wow, that makes things so much clearer. Thank you.
@leza75842 years ago
This helps so much. Very simple explanation.
@unnatishukla85132 years ago
Awesome awesome awesome video... Thank you so much!
@kimchi_taco5 years ago
Kudos man! Most intuitive explanation ever!
@fengdai23044 years ago
Ben, you are amazing!
@Trubripes7 months ago
High curvature -> sharp -> concentrated -> low variance. Makes sense.
@johanjjager3 years ago
Isn't the variance of theta hat also dependent on n, the number of observations which constitute the likelihood function?
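It is: the information in the whole sample is n times the per-observation information, so the variance of the MLE shrinks like 1/n. A quick simulation on a made-up Bernoulli example (my own, not from the video):

```python
import numpy as np

# Made-up Bernoulli example: replicate many samples of size n and watch
# the spread of the MLE shrink as n grows, tracking 1/(n * I_1(p)).
rng = np.random.default_rng(42)
p = 0.3
for n in (100, 1000, 10_000):
    # 2000 independent samples of size n; the MLE is the sample mean.
    estimates = rng.binomial(1, p, size=(2000, n)).mean(axis=1)
    crlb = p * (1 - p) / n  # Cramer-Rao lower bound for this model
    print(n, estimates.var(), crlb)
```

Each tenfold increase in n cuts the simulated variance by about a factor of ten, matching the bound.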
@1024Maverick7 years ago
You just saved my semester (again) GGWP
@jorgebretonessantamarina188 years ago
Wonderful video. Thank you very much!
@achillesarmstrong96396 years ago
OK, 3 months ago I thought I understood this video. After learning more statistics, now I understand what is going on. I didn't quite understand the concept 3 months ago.
@vitorjung4 years ago
Excellent video, congratulations!
@madhurasutar53324 years ago
Well explained man!!! Thanks a million 🙏
@coopernfsps8 years ago
Great video, as always. Helped me out a lot!
@aartisingh1387 A year ago
Thank you for this video. I have watched this video many times over the years. The simplicity, intuition, visuals, clarity, and ease are nothing less than brilliant. It has always helped whenever things get fuzzy. Just a small request, or a question if you may: calling the vertical axis "likelihood of the data" makes it a bit confusing! Instead, should it not be "likelihood of the parameter", that is, L(theta; data)? And this "likelihood of the parameter" then happens to be numerically equal to f(data|theta)? So, should the y axis not be called L(data|theta)?
@pumpkinwang5484 years ago
Thank you Ben, it was quite helpful.
@filipposchristou4417 years ago
Thanks. Good explanation. I guess you saved me hours of searching.
@davidpaganin33616 years ago
Many thanks, much appreciated!
@hankyang74665 years ago
Wonderful video, thank you!
@charlesity7 years ago
Thank you very much!
@wildboar31708 years ago
Hi Ben, I find your tutorials very easy to follow, thanks. What software are you using? I especially like the coloured pens on the black background.
@atfirstiamhuman91836 years ago
I don't know what he is using, but I sometimes use app.liveboard.online/. It also allows you to choose different backgrounds for a board and different colours, and to livestream your drawing from your tablet/smartphone to a PC, which I often use as it is better to draw by hand/pen than by mouse.
@lastua85624 years ago
You can check his website for info.
@alecvan71435 years ago
Awesome video!!
@Ekskwkwkwkw23093 years ago
In which playlist can I find these topics in an ordered manner?
@flo60336 years ago
Thanks, very intuitive. [Subscribed]
@icosum9 years ago
Excellent, many thanks!
@lucystruthers78763 years ago
Hi Ben, thank you so much for your videos. I am studying quantitative ecology and do not have a strong mathematical background - your lessons really help! May I ask how the different values of theta are generated (along the x axis)? I assume the MLE expression stays constant and that the parameter estimates vary due to sample variation, but in my case I only have one sample. I am a bit confused about whether the variance of the MLE is actually referring to variance in the parameter estimate due to sampling error. Secondly, in order to calculate the variance, must the 2nd derivative be evaluated at the value of theta which gives the MLE? I hope these questions make sense!
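On the second question: yes, for the observed information the second derivative is evaluated at the value of theta that maximises the likelihood, and that is also why a single sample is enough to attach a standard error to the estimate. A minimal sketch, assuming a hypothetical Bernoulli model:

```python
import numpy as np

# Made-up Bernoulli example: ONE observed sample suffices, because the
# second derivative is evaluated at the MLE computed from that sample.
rng = np.random.default_rng(1)
x = rng.binomial(1, 0.6, size=500)  # the single sample we actually have
n, k = len(x), x.sum()
p_hat = k / n  # MLE from this one sample

# Observed information: minus the second derivative of the log-likelihood,
# evaluated at p_hat (not at some other value of p).
obs_info = k / p_hat**2 + (n - k) / (1 - p_hat)**2
se = (1 / obs_info) ** 0.5  # estimated standard error of the MLE
print(p_hat, se)
```

The "variance of the MLE" is indeed sampling variance: how the estimate would vary across hypothetical repeated samples, even though the curvature lets you estimate it from just one.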
@achillesarmstrong96396 years ago
Wonderful video.
@nikhiln98875 years ago
Great intuition :)
@archangel54374 years ago
You da best!
@JanM3515313515 years ago
Very good.
@samah2418 years ago
I want to know the meaning of penalized MLE.
@lastua85624 years ago
Are you learning that for Machine Learning?
@Adam-de8yi10 months ago
My student finance payment should be going to people like you, not these institutions.